987 results for scientific explanations models


Relevance: 30.00%

Publisher:

Abstract:

Driven by the latest discoveries enabled by recent technological advances and space missions, the study of asteroids has awakened the interest of the scientific community. In fact, asteroid missions have proliferated in recent years (Hayabusa, Dawn, OSIRIS-REx, ARM, AIM-DART, ...), motivated by their outstanding scientific interest. Asteroids are fundamental constituents in the evolution of the Solar System, can be seen as vast concentrations of valuable natural resources, and are also considered strategic targets for the future of space exploration. It has long been hypothesized that small near-Earth objects (NEOs) could be captured and delivered to the vicinity of the Earth, allowing affordable access to them for in-situ science, resource utilization and other purposes. On the other hand, asteroids are often seen as potential planetary hazards, since impacts with the Earth happen all the time, and a sufficiently large asteroid could eventually trigger catastrophic events.
In spite of the severity of such occurrences, they are also utterly hard to predict. In fact, the rich dynamical aspects of asteroids, their complex modeling and observational uncertainties make it exceptionally challenging to predict their future positions accurately enough. This becomes particularly relevant when asteroids exhibit close encounters with the Earth, and more so when these happen recurrently. In such situations, where mitigation measures may need to be taken, it is of paramount importance to be able to accurately estimate their trajectories and collision probabilities. As a consequence, advanced tools are needed to model their dynamics and accurately predict their orbits, as well as new technological concepts to manipulate their orbits if necessary. The goal of this Thesis is to provide new methods, techniques and solutions to address these challenges. The contributions of this Thesis fall into two areas: one devoted to the numerical propagation of asteroids, and another to asteroid deflection and capture concepts. Hence, the first part of the dissertation presents novel advances applicable to the high-accuracy dynamical propagation of near-Earth asteroids using regularization and perturbation techniques, with special emphasis on the DROMO method, whereas the second part presents pioneering ideas for asteroid retrieval missions and discusses the use of an “ion beam shepherd” (IBS) for asteroid deflection purposes.
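The propagation problem described above can be illustrated with a minimal Cowell-type sketch (plain Cartesian integration of the two-body problem; the DROMO regularized formulation itself is more involved). The orbit below is an Earth-like heliocentric orbit standing in for a NEO, and the energy-drift check is a standard accuracy diagnostic:

```python
import numpy as np
from scipy.integrate import solve_ivp

MU_SUN = 1.32712440018e11  # Sun's gravitational parameter [km^3/s^2]

def two_body(t, state):
    """Cowell-style equations of motion: pure two-body acceleration.
    Perturbations (third bodies, SRP, ...) would simply be added to `acc`."""
    r, v = state[:3], state[3:]
    acc = -MU_SUN * r / np.linalg.norm(r) ** 3
    return np.concatenate([v, acc])

# Roughly Earth-like heliocentric state as a stand-in for a NEO [km, km/s]
state0 = np.array([1.496e8, 0.0, 0.0, 0.0, 29.78, 0.0])
sol = solve_ivp(two_body, (0.0, 86400.0 * 365), state0, rtol=1e-10, atol=1e-10)

def specific_energy(s):
    """Keplerian specific orbital energy, conserved in the unperturbed problem."""
    return 0.5 * np.dot(s[3:], s[3:]) - MU_SUN / np.linalg.norm(s[:3])

drift = abs(specific_energy(sol.y[:, -1]) - specific_energy(state0))
print(drift)
```

Regularized formulations such as DROMO exist precisely because this kind of Cartesian integration accumulates error quickly during close encounters, where the right-hand side becomes nearly singular.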

Abstract:

Early ancestors of crop simulation models (De Wit, 1965; Monteith, 1965; Duncan et al., 1967) were born before even primitive personal computers were available (e.g. the Apple II, released in 1977, and the IBM PC, released in 1981). These paleo-computer programs were run on mainframes with the support of punched cards. As computers became more available and powerful, crop models evolved into sophisticated tools summarizing our understanding of how crops operate. This evolution was triggered by the need to answer new scientific questions and to improve the accuracy of model simulations, especially under limiting conditions.

Abstract:

Reproducibility of scientific studies and results is a goal that every scientist must pursue when announcing research outcomes. The rise of computational science, as a way of conducting empirical studies by using mathematical models and simulations, has opened a new range of challenges in this context. The adoption of workflows as a way of detailing the scientific procedure of these experiments, along with the experimental data conservation initiatives undertaken during the last decades, has partially eased this problem. However, in order to fully address it, the conservation and reproducibility of the computational equipment related to those workflows must also be considered. The wide range of software and hardware resources required to execute a scientific workflow implies that a comprehensive description detailing what those resources are and how they are arranged is necessary. In this thesis we address the issue of reproducibility of execution environments for scientific workflows by documenting them in a formalized way, which can later be used to obtain an equivalent environment. In order to do so, we propose a set of semantic models for representing and relating the relevant information of those environments, as well as a set of tools that use these models for generating a description of the infrastructure, and an algorithmic process that consumes these descriptions to derive a new execution environment specification, which can be enacted into a new equivalent environment using virtualization solutions. We apply these three contributions to a set of representative scientific experiments belonging to different scientific domains and exposing different software and hardware requirements. The obtained results prove the feasibility of the proposed approach, by successfully reproducing the target experiments under different virtualization environments.
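The general idea can be sketched in miniature (a toy, not the thesis' semantic models or tools; the field names and package versions below are invented): a declarative description of an execution environment is mechanically rendered into a virtualization specification such as a Dockerfile, from which an equivalent environment can be recreated.

```python
# Toy declarative environment description (hypothetical field names).
env_description = {
    "base_os": "ubuntu:20.04",
    "system_packages": ["build-essential", "python3", "python3-pip"],
    "python_packages": ["numpy==1.24.0", "scipy==1.10.0"],
    "workflow_entrypoint": "python3 run_workflow.py",
}

def render_dockerfile(desc):
    """Derive a reproducible execution-environment spec from the description."""
    lines = [f"FROM {desc['base_os']}"]
    if desc["system_packages"]:
        lines.append("RUN apt-get update && apt-get install -y "
                     + " ".join(desc["system_packages"]))
    if desc["python_packages"]:
        lines.append("RUN pip3 install " + " ".join(desc["python_packages"]))
    # Pinned versions are what makes the regenerated environment equivalent.
    lines.append(f'CMD ["{desc["workflow_entrypoint"]}"]')
    return "\n".join(lines)

dockerfile = render_dockerfile(env_description)
print(dockerfile)
```

The point of the sketch is the separation of concerns: the description is the conserved artifact, and the concrete virtualization technology (Docker here, but equally a VM image) is derived from it on demand.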

Abstract:

The purpose of this thesis is to study the approximation to phenomena of heat transfer in glazed buildings through their scale replicas. The central task of this thesis is, therefore, the comparison of the thermal performance of undistorted scale models with the corresponding thermal performance of their full-scale prototypes. Indoor air temperatures of the scale model and the corresponding prototype are the main data to be compared. The first chapter of the State of the Art gives a broad overview, consisting of a historical review of the uses of scale models from antiquity to our days. In the section on the State of the Technique, the benefits and difficulties associated with their use are presented.
Additionally, in the section on the State of the Research, current scientific papers and theses on scale models are reviewed. Specifically, we focus on functional scale models: scale models that additionally replicate one or some of the functions of their corresponding prototypes. Scale models can be distorted or not. Distorted scale models are those with intentional changes, on the one hand, in dimensions scaled unevenly and, on the other hand, in constructive characteristics or materials, in order to obtain a specific performance, for instance a specific thermal performance. Undistorted scale models, scaled evenly, are those replicating, to the extent possible, the dimensional proportions and constructive configurations of their prototypes of reference. These undistorted and functional scale models are especially useful for architects because they can be used simultaneously as functional elements of analysis and as decision-making elements during the design. Despite their versatility, it is remarkable how little these types of models have been used for the study of the thermal performance of buildings. Subsequently, the theories related to the analysis of the experimental thermal data collected from the scale models, and their applicability to the corresponding full-scale prototypes, are explained. Thereafter, the experiments in the laboratory and under outdoor conditions are detailed. Firstly, experiments carried out with simple cube models at different scales are explained; the larger prototype and the corresponding undistorted scale model were subjected to the same environmental conditions in every experimental test. Secondly, a step forward is taken by carrying out simultaneous outdoor experimental tests of an undistorted scale model, a replica of a relatively simple lightweight and glazed building: the prototypes workshop (Taller de Prototipos) of the School of Architecture (ETSAM) of the Technical University of Madrid (UPM). For the analysis of the experimental data, known theories and resources are applied, such as direct comparisons, statistical analyses, dimensional analysis and, last but not least, simulations. Simulations allow flexible comparisons with the experimental data; for this reason, apart from the use of the simulation software EnergyPlus, a simulation algorithm was developed ad hoc for this research. Finally, the discussion and conclusions of this research are presented.
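One standard dimensional-analysis step behind such model-to-prototype comparisons (a generic similarity sketch, not necessarily this thesis' exact procedure; all numbers below are assumed) is matching the Fourier number Fo = αt/L² between model and prototype. With the same materials on both, this fixes the time-scale factor relating the two temperature records, after which a direct comparison metric such as an RMSE can be computed:

```python
import numpy as np

# Same materials => same thermal diffusivity alpha, so matching Fo = alpha*t/L^2
# between model and prototype gives t_model = (L_model/L_prototype)^2 * t_prototype.
L_prototype = 3.0   # characteristic length of the full-scale prototype [m] (assumed)
L_model = 0.3       # characteristic length of a 1:10 scale model [m] (assumed)

time_factor = (L_model / L_prototype) ** 2   # 1:10 geometric scale -> 1:100 time scale
print(time_factor)

# Direct comparison: put the prototype record on the model's rescaled time axis
# and compute an RMSE. The records here are synthetic and constructed to
# coincide, purely to exercise the comparison machinery.
t_proto = np.linspace(0.0, 86400.0, 200)                       # one day [s]
T_proto = 20.0 + 5.0 * np.sin(2 * np.pi * t_proto / 86400.0)   # synthetic [degC]
t_model = time_factor * t_proto                                 # equivalent model times
T_model = 20.0 + 5.0 * np.sin(2 * np.pi * (t_model / time_factor) / 86400.0)

rmse = np.sqrt(np.mean((T_model - T_proto) ** 2))
print(rmse)
```

With real data the RMSE would quantify how far the undistorted model departs from strict thermal similarity, since boundary conditions (sun, wind) cannot themselves be scaled.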

Abstract:

The Mouse Tumor Biology (MTB) Database serves as a curated, integrated resource for information about tumor genetics and pathology in genetically defined strains of mice (i.e., inbred, transgenic and targeted mutation strains). Sources of information for the database include the published scientific literature and direct data submissions by the scientific community. Researchers access MTB using Web-based query forms and can use the database to answer such questions as ‘What tumors have been reported in transgenic mice created on a C57BL/6J background?’, ‘What tumors in mice are associated with mutations in the Trp53 gene?’ and ‘What pathology images are available for tumors of the mammary gland regardless of genetic background?’. MTB has been available on the Web since 1998 from the Mouse Genome Informatics web site (http://www.informatics.jax.org). We have recently implemented a number of enhancements to MTB including new query options, redesigned query forms and results pages for pathology and genetic data, and the addition of an electronic data submission and annotation tool for pathology data.

Abstract:

The process of liquid silicon infiltration is investigated for channels with radii from 0.25 to 0.75 mm drilled in compact carbon preforms. The advantage of this setup is that the study of the phenomenon is simplified. For comparison purposes, a framework is worked out for evaluating the accuracy of simulations. The approach relies on dimensionless numbers involving the properties of the surface reaction. It turns out that the complex hydrodynamic behavior derived from Newton's second law can be made consistent with lattice-Boltzmann simulations. The experiments give clear evidence that the growth of silicon carbide proceeds in two different stages, and the basic mechanisms are highlighted. Lattice-Boltzmann simulations prove to be an effective tool for the description of the growth phase; in particular, essential experimental constraints can be implemented. As a result, the existing models are useful to gain more insight into the process of reactive infiltration into porous media in the first stage of penetration, i.e. up to pore closure due to surface growth. A way to incorporate the resistance from the chemical reaction into Darcy's law is also proposed.
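The coupling of capillary-driven infiltration to surface growth can be sketched with a toy Washburn-type model (assumed parameter values, not the paper's lattice-Boltzmann framework): the channel radius shrinks as SiC grows on the wall, so penetration stops at pore closure, mirroring the first stage described above.

```python
import numpy as np

# Toy sketch with assumed parameters: Washburn-type infiltration of liquid Si
# into a cylindrical channel whose radius shrinks as SiC grows on the wall.
gamma = 0.8                  # surface tension of liquid Si [N/m], approximate
theta = np.deg2rad(30.0)     # wetting contact angle, assumed
mu = 8e-4                    # dynamic viscosity of liquid Si [Pa s], approximate
r0 = 0.5e-3                  # initial channel radius [m], mid-range of 0.25-0.75 mm
k = 1e-4                     # radius shrink rate from SiC growth [m/s], assumed

# Washburn balance: h*dh/dt = gamma*cos(theta)*r/(4*mu),
# i.e. d(h^2)/dt = gamma*cos(theta)*r/(2*mu), which avoids the 1/h singularity.
dt, t, r, h2 = 1e-3, 0.0, r0, 0.0
while r > 0.0:
    h2 += gamma * np.cos(theta) * r / (2.0 * mu) * dt
    r -= k * dt              # the surface reaction closes the pore
    t += dt

depth = np.sqrt(h2)
print(t, depth)              # time to pore closure [s] and infiltrated depth [m]
```

Even this crude model reproduces the qualitative picture: infiltration is self-limiting, and the reaction term acts as an extra resistance of the kind the paper proposes to fold into Darcy's law.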

Abstract:

This paper considers the role of social model features in the economic performance of Italy and Spain during the run-up to the Eurozone crisis, as well as the consequences of that crisis, in turn, for the two countries' social models. It takes issue with the prevailing view - what I refer to as the “competitiveness thesis” - which attributes the debtor status of the two countries to a lack of competitive capacity rooted in social model features. This competitiveness thesis has been key in justifying the “liberalization plus austerity” measures that European institutions have demanded in return for financial support for Italy and Spain at critical points during the crisis. The paper challenges this prevailing wisdom. First, it reviews the characteristics of the Italian and Spanish social models and their evolution in the period prior to the crisis, revealing a far more complex, dynamic and differentiated picture than is given in the political economy literature. Second, the paper considers various ways in which social model characteristics are said to have contributed to the Eurozone crisis, finding such explanations wanting. Italy and Spain's debtor status was primarily the result of much broader dynamics in the Eurozone, including capital flows from richer to poorer countries that affected economic demand, with social model features playing, at most, an ancillary role. More aggressive reforms responding to EU demands in Spain may have increased the long-term social and economic costs of the crisis, whereas the political stalemate that slowed such reforms in Italy may have paradoxically mitigated those costs. The comparison of the two countries thus suggests that, in the absence of broader macro-institutional reform of the Eurozone, compliance with EU dictates may have had perverse effects.

Abstract:

This study aims to estimate the influence of open access on the publication patterns of the Argentine scientific community in different subject fields (Medicine; Physics and Astronomy; Agriculture and Biological Sciences; and Social Sciences and Humanities), based on an analysis of the access model of the journals chosen to communicate research results in the period 2008-2010. The output was collected from the SCOPUS database, and the access models of the journals were determined by consulting the sources DOAJ, e-revist@s, SciELO, Redalyc, PubMed, Sherpa-Romeo and Dulcinea. The real and potential accessibility of the national scientific output via the gold and green roads, respectively, was analyzed, as well as access by subscription through the Electronic Library of Science and Technology of the Ministry of Science, Technology and Productive Innovation of Argentina. The results show that, on average and across the subject fields studied, 70% of the Argentine scientific output visible in SCOPUS is published in journals that adhere in one way or another to the open access movement, in a proportion of 27% for the gold road and 43% for journals that allow self-archiving via the green road. Between 16% and 30% (depending on the subject area) of the articles published in journals that allow self-archiving are accessed via subscription. The percentage of journals without open access is around 30% in Social Sciences and Humanities, and reaches about 45% in the other areas. It is concluded that Argentina presents very favorable conditions for releasing a high percentage of the scientific literature generated in the country under the open access model, through institutional repositories and self-archiving mandates, thereby also contributing to increasing the accessibility and long-term preservation of the national scientific and technological output.

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Abstract:

Control engineering is an essential part of university electrical engineering education. A control course normally requires considerable mathematical as well as engineering knowledge and is consequently regarded as difficult by many undergraduate students. From the academic point of view, helping students to improve their learning of control engineering is therefore an important task which requires careful planning and innovative teaching methods. Traditionally, a didactic teaching approach has been used to teach students the concepts needed to solve control problems. This approach is commonly adopted in many mathematics-intensive courses; however, it generally lacks reflection from the students to improve their learning. This paper addresses the practice of action learning and context-based learning models in teaching university control courses. This context-based approach has been practised in several control engineering courses at a university, with promising results, particularly in terms of student learning performance.

Abstract:

We review the recent progress on the construction of determinant representations of correlation functions for integrable supersymmetric fermion models. The factorizing F-matrices (or the so-called F-basis) play an important role in the construction. In the F-basis, the creation (and annihilation) operators and the Bethe states of the integrable models are given in completely symmetric forms. This leads to determinant representations of the scalar products of the Bethe states for these models. Based on the scalar products, determinant representations of the correlation functions may be obtained. As an example, in this review we give the determinant representation of the two-point correlation function for the U_q(gl(2|1)) (i.e. q-deformed) supersymmetric t-J model. The determinant representations are useful for analyzing the physical properties of the integrable models in the thermodynamic limit.

Abstract:

An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local FDR (false discovery rate) is provided for each gene. An attractive feature of the mixture model approach is that it provides a framework for the estimation of the prior probability that a gene is not differentially expressed, and this probability can subsequently be used in forming a decision rule. The rule can also be formed to take the false negative rate into account. We apply this approach to a well-known publicly available data set on breast cancer, and discuss our findings with reference to other approaches.
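The mixture-model idea can be sketched generically (a two-component Gaussian sketch on simulated z-scores, not the authors' exact model): the local FDR of a gene with statistic z is fdr(z) = π₀f₀(z)/f(z), where f = π₀f₀ + (1 − π₀)f₁, and a short EM loop estimates π₀, the prior probability that a gene is not differentially expressed, along with the alternative component.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated z-scores: 90% null genes ~ N(0,1), 10% differentially expressed ~ N(3,1).
z = np.concatenate([rng.normal(0.0, 1.0, 9000), rng.normal(3.0, 1.0, 1000)])

# Mixture f(z) = pi0*f0(z) + (1-pi0)*f1(z) with the null f0 = N(0,1) known;
# EM estimates pi0 and the alternative mean mu1 (unit variance assumed).
pi0, mu1 = 0.5, 2.0
for _ in range(50):
    f0 = stats.norm.pdf(z, 0.0, 1.0)
    f1 = stats.norm.pdf(z, mu1, 1.0)
    tau = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)   # P(null | z) = local FDR
    pi0 = tau.mean()
    mu1 = np.sum((1 - tau) * z) / np.sum(1 - tau)

local_fdr = pi0 * stats.norm.pdf(z, 0.0, 1.0) / (
    pi0 * stats.norm.pdf(z, 0.0, 1.0) + (1 - pi0) * stats.norm.pdf(z, mu1, 1.0))

# Decision rule: call a gene differentially expressed if its local FDR < 0.2.
n_called = int(np.sum(local_fdr < 0.2))
print(pi0, mu1, n_called)
```

The attractive feature noted in the abstract shows up directly here: π₀ is a by-product of the fit, and the same posterior quantity drives the decision rule, which could equally be tuned against the false negative rate.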

Abstract:

Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model, using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns, with transitional gradients from one vegetation community to another; arbitrary, though often unrealistic, sharp boundaries can otherwise be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including site sampling, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, model validation with an independent data set, and scale assessment of the model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r² = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including the provision of vital information for conservation planning and management; a scientific basis for the rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for the conservation and forestry planning of poorly studied areas. (c) 2006 Elsevier B.V. All rights reserved.
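The integration of per-community statistical models with rule-based post-processing can be sketched as follows (hypothetical community names, rule and data, and only three models rather than the paper's 28): each model yields a per-pixel probability surface, the composite map takes the most probable community per pixel, and a simple ecological rule then overrides the statistical winner where it applies.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stacked outputs of the per-community statistical models (hypothetical names).
communities = ["rainforest", "sclerophyll", "mangrove"]
h, w = 4, 5
prob = rng.random((len(communities), h, w))
prob /= prob.sum(axis=0)                   # normalise to probabilities per pixel

composite = prob.argmax(axis=0)            # winner-takes-all model integration

# Rule-based ecological constraint (invented): mangroves only occur below 2 m
# elevation; where the rule fails, fall back to the second-best community.
elevation = rng.uniform(0.0, 10.0, (h, w))
mangrove = communities.index("mangrove")
runner_up = prob.argsort(axis=0)[-2]       # second-most-probable community
invalid = (composite == mangrove) & (elevation >= 2.0)
composite[invalid] = runner_up[invalid]

print(composite.shape, composite.min(), composite.max())
```

Keeping the full probability stack, rather than only the winning label, is what makes this kind of rule-based correction (and the representation of transitional gradients) possible.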