72 results for Ray theoretical Model
at Universidad Politécnica de Madrid
Abstract:
We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer, interrogated by measuring vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; biorecognition of antiBSA produces a shift in the reflectivity curve that is related to the antiBSA concentration. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm, and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain (FDTD) method. A simplified 2D model is proposed: a multilayer model in which the pillars are replaced by a 420 nm layer with an effective refractive index obtained using a Beam Propagation Method (BPM) algorithm. Results provided by this model correlate well with the experimental data while reducing computation time from one day to 15 minutes, giving a fast but accurate tool to optimize the design, maximize sensitivity, and analyze the influence of different variables (diameter, height, and lattice parameter). Sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, from both the fabrication point of view (limited by the aspect ratio and proximity of the pillars) and the fluidic one. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
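The multilayer simplification described above lends itself to a textbook transfer-matrix calculation at normal incidence. The sketch below is only an illustration of that approach, not the authors' BPM-derived model: the effective index of the pillar layer (1.30), the Si substrate index (3.9, taken as real), and the 633 nm probe wavelength are assumed values for demonstration.

```python
import cmath

def layer_matrix(n, d, lam):
    # characteristic matrix of one homogeneous layer at normal incidence
    phi = 2.0 * cmath.pi * n * d / lam
    return [[cmath.cos(phi), 1j * cmath.sin(phi) / n],
            [1j * n * cmath.sin(phi), cmath.cos(phi)]]

def reflectivity(layers, n_in, n_sub, lam):
    """layers: list of (refractive index, thickness) pairs, thickness in nm."""
    M = [[1.0, 0.0], [0.0, 1.0]]
    for n, d in layers:
        L = layer_matrix(n, d, lam)
        M = [[M[0][0] * L[0][0] + M[0][1] * L[1][0],
              M[0][0] * L[0][1] + M[0][1] * L[1][1]],
             [M[1][0] * L[0][0] + M[1][1] * L[1][0],
              M[1][0] * L[0][1] + M[1][1] * L[1][1]]]
    num = n_in * (M[0][0] + M[0][1] * n_sub) - (M[1][0] + M[1][1] * n_sub)
    den = n_in * (M[0][0] + M[0][1] * n_sub) + (M[1][0] + M[1][1] * n_sub)
    return abs(num / den) ** 2

# pillar lattice replaced by a 420 nm layer of assumed effective index 1.30,
# on 1000 nm of SiO2 (n = 1.45) over a Si substrate (n = 3.9, assumed real)
stack = [(1.30, 420.0), (1.45, 1000.0)]
R = reflectivity(stack, n_in=1.0, n_sub=3.9, lam=633.0)
```

Sweeping `lam` over a wavelength range would reproduce the kind of reflectivity curve whose spectral shift encodes the antiBSA concentration.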
Abstract:
This paper introduces a theoretical model for developing integrated degree programmes through e-learning systems, as stipulated by a collaboration agreement signed by two universities. We have analysed several collaboration agreements between universities at the national, European, and transatlantic levels, as well as various e-learning frameworks. A conceptual model, a business model, and the architecture design are presented as parts of the theoretical model. The paper presents a way of implementing e-learning systems as a tool to support inter-institutional degree collaborations, from the signing of the collaboration agreement to the implementation of the necessary services. In order to show how the theory can be tested, one sample scenario is presented.
Abstract:
Adaptive agents use feedback as a key strategy to cope with uncertainty and change in their environments. The information fed back from the sensorimotor loop into the control subsystem can be used to change four different elements of the controller: the parameters associated with the control model, the control model itself, the functional organization of the agent, and the functional realization of the agent. There are many change alternatives, and hence the complexity of the agent's space of potential configurations is daunting. The only viable alternative for space- and time-constrained agents, in practical, economical, and evolutionary terms, is to reduce the dimensionality of this configuration space. Emotions play a critical role in this reduction, which is achieved by functionalization, interface minimization, and patterning, i.e. by selection among a predefined set of organizational configurations. This analysis lets us state, in strict functional terms, how autonomy emerges from the integration of cognitive, emotional, and autonomic systems: autonomy is achieved by the closure of functional dependency. Emotion-based morphofunctional systems are able to exhibit complex adaptation patterns at a reduced cognitive cost. In this article we present a general model of how emotion supports functional adaptation and of how biological emotional systems operate following this theoretical model. We also show how this model is applicable to the construction of a wide spectrum of artificial systems.
Abstract:
Idea Management Systems are web applications that implement the notion of open innovation through crowdsourcing. Typically, organizations use this kind of system to connect to large communities in order to gather ideas for improving products or services. Originating from simple suggestion boxes, Idea Management Systems have advanced beyond collecting ideas and aspire to be knowledge management solutions capable of selecting the best ideas via collaborative as well as expert assessment methods. In practice, however, contemporary systems still face a number of problems, usually related to information overflow and to recognizing submissions of questionable quality within a reasonable allocation of time and effort. This thesis focuses on the idea assessment problem area and contributes a number of solutions for filtering, comparing, and evaluating ideas submitted to an Idea Management System. With respect to Idea Management System interoperability, the thesis proposes a theoretical model of the Idea Life Cycle and formalizes it as the Gi2MO ontology, which makes it possible to go beyond the boundaries of a single system and to compare and assess innovation in an organization-wide or market-wide context. Furthermore, based on the ontology, the thesis builds a number of solutions for improving idea assessment via community opinion analysis (MARL), annotation of idea characteristics (Gi2MO Types), and study of idea relationships (Gi2MO Links). The main achievements of the thesis are: the application of theoretical innovation models to the practice of Idea Management, successfully recognizing the differentiation between communities; opinion metrics and their recognition as a new tool for idea assessment; and the discovery of new relationship types between ideas and their impact on idea clustering. Finally, an outcome of the thesis is the establishment of the Gi2MO Project, which serves as an incubator for Idea Management solutions and for mature open-source alternatives to the widely available commercial suites.
From the academic point of view, the project delivers resources for undertaking experiments in the Idea Management Systems area and has managed to become a forum that has gathered a number of academic and industrial partners.
Abstract:
Office automation is one of the fields where the complexity related to technologies and working environments can best be shown. This is the starting point we have chosen to build a theoretical model that shows us a scene quite different from the one traditionally considered. Through the development of the model, the levels of complexity associated with office automation and office environments have been identified, and a relationship between them has been established. Thus, the model allows us to state a general principle for the sociotechnical design of office automation systems, comprising the ontological distinctions needed to properly evaluate each particular technology and its potential contribution to office automation. From this comes the model's taxonomic ability to draw a global perspective of the state of the art in office automation technologies.
Abstract:
One of the most challenging problems that must be solved by any theoretical model purporting to explain the competence of the human brain for relational tasks is the analysis and representation of the internal structure of an extended spatial layout of multiple objects. Some of these problems are related to specific questions, such as how we can extract and represent spatial relationships among objects, how we can represent the movement of a selected object, and so on. The main objective of this paper is the study of some plausible brain structures that can provide answers to these problems. Moreover, in order to achieve more concrete knowledge, our study focuses on the response of the retinal layers in optical information processing and on how this information can be processed in the first cortical layers. The model reported here is just a first trial, and some major additions are needed to complete the whole vision process.
Abstract:
The gust wind tunnel at IDR, Universidad Politécnica de Madrid (UPM), has been enhanced, and the impact of the modification has been characterized. Several flow quality configurations have been tested. The problems of measuring gusty winds with Pitot tubes have been considered. Experimental results have been obtained and compared with theoretically calculated results (based on potential flow theory). A theoretical correction term has been proposed for unsteady flow measurements obtained with Pitot tubes. The effect of unsteady flow on structures and on bodies lying on the ground has also been considered. A theoretical model has been proposed for a semi-circular cylinder, and experimental tests have been performed to study the unsteady flow effects, which can help to clarify the phenomenon.
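The kind of correction term mentioned above can be illustrated with the unsteady Bernoulli equation, where the pressure difference sensed by a Pitot tube picks up a contribution proportional to the flow acceleration, Δp ≈ ½ρU² + ρL·dU/dt. The sketch below is a generic illustration under that assumption, not the paper's actual correction; the effective length `L_EFF` and all numbers are hypothetical.

```python
import math

RHO = 1.225    # air density, kg/m^3
L_EFF = 0.05   # effective tube length, m (hypothetical value)

def quasi_steady_speed(dp):
    # naive Pitot reading: dp = 0.5 * RHO * U^2
    return math.sqrt(2.0 * dp / RHO)

def corrected_speed(dp, dU_dt):
    # unsteady Bernoulli: the sensed pressure difference also contains an
    # acceleration term RHO * L_EFF * dU/dt, which is subtracted out here
    dp_steady = dp - RHO * L_EFF * dU_dt
    return math.sqrt(max(2.0 * dp_steady / RHO, 0.0))

# a gust accelerating at 20 m/s^2 while the true speed is 10 m/s:
dp_sensed = 0.5 * RHO * 10.0 ** 2 + RHO * L_EFF * 20.0
u_naive = quasi_steady_speed(dp_sensed)    # overestimates the speed
u_corr = corrected_speed(dp_sensed, 20.0)  # recovers the true 10 m/s
```

During deceleration the sign of the extra term reverses, so a quasi-steady reading would instead underestimate the gust speed.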
Abstract:
A simple semi-empirical model for the aerodynamic behavior of a low-aspect-ratio pararotor in autorotation at low Reynolds numbers is presented. The paper is split into three sections: Sec. II deals with the derivation of the theoretical model, Sec. III with the wind-tunnel measurements needed for tuning it, and Sec. IV with the tuning of the theoretical model against the experimental data. The study focuses on the effects of the blade pitch angle, the blade roughness, and the stream velocity on the rotation velocity and on the drag of a model. Flow pattern visualizations have also been performed. The values of the free aerodynamic parameters of the semi-empirical model that produce the best fit with the experimental results agree with those expected for the blades at the test conditions. Finally, the model is able to describe the behavior of a pararotor in autorotation rotating fixed to a shaft, validated for a range of blade pitch angles. The movement of the device is found to be governed by a reduced set of dimensionless parameters.
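Tuning the free parameters of a semi-empirical model against wind-tunnel data, as in Sec. IV above, typically reduces to a least-squares fit. The toy sketch below assumes, purely hypothetically, a one-parameter linear relation between rotation rate and stream velocity; the data points are made up and are not the paper's measurements.

```python
# made-up wind-tunnel data: stream velocity V (m/s) vs rotation rate (rad/s)
V = [5.0, 7.5, 10.0, 12.5, 15.0]
omega = [10.1, 14.8, 20.3, 24.9, 30.2]

# hypothetical one-parameter model omega = k * V, tuned by least squares;
# the slope through the origin has the closed form k = sum(V*w) / sum(V^2)
k = sum(v * w for v, w in zip(V, omega)) / sum(v * v for v in V)
residuals = [w - k * v for v, w in zip(V, omega)]
```

A real tuning exercise would fit several aerodynamic coefficients at once, but the principle of minimizing the residuals against the measurements is the same.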
Abstract:
NPV is a static measure of project value which does not discriminate between levels of internal and external risk in project valuation. Given the characteristics of current investment projects, a much more complex model is needed: one that includes the value of flexibility and the different risk levels associated with variables subject to uncertainty (price, costs, exchange rates, grade and tonnage of the deposits, and cut-off grade, among many others). Few of these variables present any correlation or can be treated uniformly. In this context, Real Option Valuation (ROV) arose more than a decade ago as a mainly theoretical model with the potential for simultaneous calculation of the risk associated with such variables. This paper reviews the literature on the application of Real Option Valuation in mining, noting the prior focus on external risks, and presents a case study where ROV is applied to quantify the risk associated with mine planning.
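The contrast between static NPV and option-based valuation can be made concrete with a standard one-factor binomial lattice for the option to defer an investment. This is a generic ROV building block, not the model of the reviewed case study, and all figures are hypothetical.

```python
import math

def defer_option_value(V0, I, sigma, r, T, steps):
    """Value of the right to invest I at time T in a project worth V0 today
    (European-style deferral option on a one-factor binomial lattice)."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))   # up factor per step
    d = 1.0 / u                           # down factor per step
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral probability
    disc = math.exp(-r * dt)
    # option payoff max(V - I, 0) at the end of the lattice
    values = [max(V0 * u ** j * d ** (steps - j) - I, 0.0)
              for j in range(steps + 1)]
    # backward induction to today
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# hypothetical mine: project value 100, investment cost 100, 30% volatility
opt = defer_option_value(V0=100.0, I=100.0, sigma=0.3, r=0.05, T=2.0, steps=200)
npv = 100.0 - 100.0  # static NPV of investing immediately is zero
```

With these numbers the static NPV is zero, yet the deferral option is clearly positive: exactly the flexibility value that NPV, as a static measure, ignores.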
Abstract:
The objective of this project is to show that the permissible explosive called 20 SR is able to pull out the coal satisfactorily under normal blasting conditions, and to establish the equivalence between 20 SR and a gelatin dynamite (Goma 2 ECO). To achieve this goal, a series of blasts was carried out, varying the blasting conditions and the powder factor of the 20 SR. Commercial software was used to analyze fragmentation based on images of the blasted rock. The results of this analysis were compared with the results of the Kuz-Ram theoretical fragmentation model. It was shown that the 20 SR explosive is able to pull out the coal for different coal rock compositions. As a result of this project, we can conclude that 20 SR seems able to pull out the coal under normal blasting conditions, using a powder factor scaled by the "ballistic mortar" strength ratio between the two explosives.
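The Kuz-Ram model referenced above combines Kuznetsov's mean-size equation with a Rosin-Rammler size distribution. A minimal sketch of its textbook form follows; the rock factor, powder factor, charge mass, and relative weight strength used here are purely illustrative, not data from these blasts.

```python
import math

def kuz_ram_x50(A, K, Q, rws):
    """Kuznetsov mean fragment size in cm.
    A: rock factor, K: powder factor (kg/m^3), Q: charge per hole (kg),
    rws: relative weight strength of the explosive (ANFO = 100)."""
    return A * K ** -0.8 * Q ** (1.0 / 6.0) * (115.0 / rws) ** (19.0 / 30.0)

def fraction_passing(x, x50, n):
    # Rosin-Rammler cumulative distribution with uniformity index n
    return 1.0 - math.exp(-0.693 * (x / x50) ** n)

# illustrative blast, not data from the project:
x50 = kuz_ram_x50(A=7.0, K=0.6, Q=20.0, rws=85.0)
p_at_x50 = fraction_passing(x50, x50, n=1.5)  # ~0.5 by construction
```

Comparing a curve like `fraction_passing` against the image-analysis sieve curve is what the project's model-versus-measurement comparison amounts to in practice.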
Abstract:
The discretion and subjectivity of appraisers that characterize traditional real estate valuation are still allowed to take part in the formation of the asset price, even when international standards (EVS, IVS) or appraisal institutions' regulations (TEGOVA, RICS, etc.) are respected. The application of econometric and statistical methods to real estate valuation aims at eliminating subjectivity from the appraisal process. But the unanswered question underlying this subject is the following: how important is the subjective component in the formation of real estate appraisal values? In this study, Structural Equation Models (SEM) are used to determine the importance of the objective and subjective components in the formation of real estate valuation values, as well as the weight of economic factors and the current economic context on price formation in real estate appraisal for mortgage purposes. Two latent variables were used, Objective Component and Subjective Component, which aggregate objective observed variables and subjective observed and unobserved variables, respectively. Exploratory Factor Analysis is the statistical technique used to link the observed variables extracted from the valuation appraisal reports to the latent constructs derived from the theoretical model. SEM models were used to refine the model, eliminate non-significant variables, and determine the weights of the Objective and Subjective latent variables. These techniques were applied to a sample of over 11,000 real estate appraisal reports covering the period from November 2006 to April 2012. The real assets used in this study are located in Lisbon's Metropolitan Area ("Grande Lisboa"), Portugal. From this study, we conclude that the Subjective Component has a considerable weight in the formation of real estate appraisal values, and that the external factor Economic Situation has a very small impact on it.
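The idea underlying both Exploratory Factor Analysis and SEM, that latent constructs can be identified from the covariance structure of observed indicators, can be shown with a toy one-factor measurement model on simulated data. This is not the thesis's model; the loadings and noise levels below are made up.

```python
import random

random.seed(0)

# toy one-factor measurement model: two observed indicators x1, x2 driven by
# one latent factor plus independent noise; loadings are the "truth" to recover
N = 20000
TRUE_L1, TRUE_L2 = 0.8, 0.6
latent = [random.gauss(0.0, 1.0) for _ in range(N)]
x1 = [TRUE_L1 * f + random.gauss(0.0, 0.3) for f in latent]
x2 = [TRUE_L2 * f + random.gauss(0.0, 0.3) for f in latent]

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

# with a single factor and uncorrelated noise, cov(x1, x2) = l1 * l2:
# the product of the loadings is identified from the observed data alone
loading_product = cov(x1, x2)
```

A full SEM generalizes this: many indicators, several latent variables, and a fitted structural part, but identification still rests on the observed covariance matrix.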
Abstract:
The linear stability analysis of accelerated double ablation fronts is carried out numerically with a self-consistent approach. Accurate hydrodynamic profiles are taken into account in the theoretical model by means of a fitting-parameters method using 1D simulation results. The numerical dispersion relation is compared to an analytical sharp-boundary model [Yañez et al., Phys. Plasmas 18, 052701 (2011)], showing excellent agreement in the radiation-dominated regime of very steep ablation fronts, as well as the stabilization due to smooth profiles. 2D simulations are presented to validate the numerical self-consistent theory.
Abstract:
Ionoluminescence (IL) of α-quartz exhibits two dominant emission bands, peaking at 1.9 eV (non-bridging oxygen hole centers, NBOHCs) and 2.7 eV (self-trapped excitons, STEs). The evolution of the red emission yield does not correlate with the concentration of NBOHCs or of other color centers. The blue emission yield closely follows the amorphization kinetics independently measured by RBS/C spectrometry. A simple theoretical model is proposed; it assumes that the formation and recombination of STEs is the primary event, and that both the light emissions and the structural damage to the lattice are consequences of this phenomenon. The model leads to several simple mathematical equations that can be used to simulate the IL yields and provide a reasonable fit to the experimental kinetic data.
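As a hedged illustration of how such kinetic equations can look, the sketch below uses a minimal single-impact (Poisson-type) damage law and assumes, as one plausible reading, that the blue STE emission decays with the remaining crystalline fraction. The cross section is purely illustrative, and the paper's actual rate equations are more detailed.

```python
import math

SIGMA = 1.0e-13  # amorphization cross section per ion, cm^2 (illustrative)

def amorphous_fraction(fluence):
    # single-impact (Poisson) damage law: damaged fraction saturates at 1
    return 1.0 - math.exp(-SIGMA * fluence)

def blue_yield(fluence, y0=1.0):
    # assumed reading: the STE (blue) emission scales with the remaining
    # crystalline fraction, so its yield decays as the lattice amorphizes
    return y0 * math.exp(-SIGMA * fluence)
```

Under this toy law the blue yield and the amorphous fraction are exact mirror images, which is the sense in which one kinetic curve can "follow" the other.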
Abstract:
The wake produced by the structural supports of ultrasonic anemometers (UAs) causes distortions in the velocity field in the vicinity of the sonic path. These distortions are measured by the UA, inducing errors in the determination of the mean velocity, turbulence intensity, spectrum, etc.: basic parameters for determining the effect of wind on structures. Additionally, these distortions can lead to indefinition in the calibration function of the sensors (Cuerva et al., 2004). Several wind tunnel tests have been dedicated to obtaining experimental data, from which fit models have been developed to describe and correct these distortions (Kaimal, 1978 and Wyngaard, 1985). This work explores the effect of a vortex wake generated by the supports of a UA on the wind speed measured by this instrument. To do this, the Von Kármán vortex street potential model is combined with the mathematical model of the measuring process carried out by UAs developed by Franchini et al. (2007). The results obtained are correction functions for the measured wind velocity, which depend on the geometry of the sonic anemometer and on the aerodynamic conditions. These results have been validated against those obtained in a wind tunnel test of a single-path UA especially developed for research. The supports of this UA were modified in order to reproduce the conditions of the theoretical model. Good agreement between experimental and theoretical results has been found.
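The combination described above, an analytic vortex-street velocity field averaged along the sonic path, can be sketched with a single infinite row of potential point vortices (a full Kármán street superposes two such staggered rows of opposite sign). The vortex strength, spacing, and path geometry below are illustrative, and this is not the Franchini et al. (2007) model itself.

```python
import cmath

GAMMA = 0.5  # vortex strength, m^2/s (illustrative)
A = 0.1      # spacing between vortices in the row, m (illustrative)

def induced_velocity(z):
    # conjugate velocity u - i*v induced at point z by an infinite row of
    # point vortices of strength GAMMA spaced A apart along the real axis
    return -1j * GAMMA / (2.0 * A) / cmath.tan(cmath.pi * z / A)

def path_average_u(z0, z1, samples=400):
    # line average of the along-path (u) component over the sonic path z0-z1,
    # mimicking the spatial averaging performed by an ultrasonic anemometer
    total = 0.0
    for k in range(samples):
        z = z0 + (z1 - z0) * (k + 0.5) / samples
        total += induced_velocity(z).real
    return total / samples

# a horizontal path one full street period long, 3 cm above the vortex row
avg_u = path_average_u(complex(-0.05, 0.03), complex(0.05, 0.03))
```

Averaged over exactly one period, the oscillatory part of the induced field cancels and only the row's mean drift survives; paths of other lengths or orientations retain a residual, which is the kind of geometry-dependent error the correction functions address.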