37 results for Simulation and Modeling


Relevance: 90.00%

Abstract:

The information provided by the alignment-independent GRid-INdependent Descriptors (GRIND) can be condensed by principal component analysis, yielding a small number of principal properties (GRIND-PP) that are more suitable for describing molecular similarity. The objective of the present study is to optimize the diverse parameters involved in computing the GRIND-PP and to validate their suitability for applications requiring a biologically relevant description of molecular similarity. With this aim, GRIND-PP computed with a collection of diverse settings were used to carry out ligand-based virtual screening (LBVS) under standard conditions. The quality of the results was remarkable and comparable with other LBVS methods, and a detailed statistical analysis allowed us to identify the method settings most determinant for the quality of the results, together with their optimum values. Remarkably, some of these optimum settings differ significantly from those used in previously published applications, revealing their unexplored potential. Their applicability to large compound databases was also explored by comparing the equivalence of results obtained using either computed or projected principal properties. In general, the results of the study confirm the suitability of GRIND-PP for practical applications and provide useful hints about how they should be computed to obtain optimum results.
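As a rough illustration of the condensation step described above, the sketch below applies PCA (via an SVD) to a random stand-in descriptor matrix; the matrix, its dimensions and the number of retained principal properties are all illustrative assumptions, not values from the study.

```python
import numpy as np

# Illustrative sketch (not the authors' code): condensing a high-dimensional
# descriptor matrix into a few principal properties via PCA, as done for
# GRIND -> GRIND-PP. The descriptor matrix here is random stand-in data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))        # 50 molecules x 200 GRIND variables

Xc = X - X.mean(axis=0)               # centre each descriptor column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

n_pp = 5                              # number of principal properties kept
scores = Xc @ Vt[:n_pp].T             # GRIND-PP: one row per molecule
explained = (s[:n_pp] ** 2).sum() / (s ** 2).sum()
```

New compounds could be projected onto the same loadings (`(X_new - X.mean(axis=0)) @ Vt[:n_pp].T`), which corresponds to the projected principal properties whose equivalence the study examines.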

Relevance: 90.00%

Abstract:

Earthquakes represent a major hazard for populations around the world, causing frequent loss of life, human suffering and enormous damage to homes, other buildings and infrastructure. The Technology Resources for Earthquake Monitoring and Response (TREMOR) Team of 36 space professionals analysed this problem over the course of the International Space University Summer Session Program and published their recommendations in the form of a report. The TREMOR Team proposes a series of space- and ground-based systems to provide improved capability to manage earthquakes. The first proposed system is a prototype earthquake early-warning system that improves the existing knowledge of earthquake precursors and addresses the potential of these phenomena. Thus, the system will at first enable the definitive assessment of whether reliable earthquake early warning is possible through precursor monitoring. Should the answer be affirmative, the system itself would then form the basis of an operational early-warning system. To achieve these goals, the authors propose a multi-variable approach in which the system will combine, integrate and process precursor data from space- and ground-based seismic monitoring systems (already existing and newly proposed systems) and data from a variety of related sources (e.g. historical databases, space weather data, fault maps). The second proposed system, the prototype earthquake simulation and response system, coordinates the main components of the response phase to reduce the time delays of response operations, increase the level of precision in the data collected, facilitate communication amongst teams, enhance rescue and aid capabilities and so forth. It is based in part on an earthquake simulator that will provide pre-event (if early warning is proven feasible) and post-event damage assessment and detailed data of the affected areas to the corresponding disaster management actors by means of a geographic information system (GIS) interface. This is coupled with proposed mobile satellite communication hubs to provide links between response teams. Business- and policy-based implementation strategies for these proposals, such as the establishment of a non-governmental organisation to develop and operate the systems, are included.

Relevance: 90.00%

Abstract:

Between 1995 and 2005, the Spanish economy grew at an annual average rate above 3.5%. Total employment increased by more than 4.9 million. Most of this growth was in occupations related to university degrees (more than 890,000 jobs, 18% of the total employment increase) and vocational qualifications (more than 855,000 jobs, 17.5% of the total employment increase). From a sectoral perspective, most of this increase took place in "Real estate, renting and business activities" (sector K in NACE Rev. 1), "Construction" (sector F) and "Health and social work" (sector N). This paper analyses this employment growth in an input-output framework, by means of a structural decomposition analysis (SDA). Two kinds of results are obtained. From a sectoral perspective, we decompose employment growth into labour-requirements change, technical change and demand change. From an occupational perspective, we decompose employment growth into a substitution effect, a labour-productivity effect and a demand effect. The results show that, in aggregate terms, most of this growth is attributable to demand growth, with a small technical improvement. But the results also show that this aggregate behaviour hides important sectoral and occupational variation. The purpose of this paper is to contribute to the ongoing debate over productivity growth and what has been called the "growth model" of the Spanish economy.
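The two-factor version of such a decomposition can be sketched as follows; the coefficients and outputs are made-up numbers, and the paper's actual SDA works within a full input-output framework rather than this stripped-down identity.

```python
import numpy as np

# Hedged sketch of a two-factor structural decomposition of employment
# change: Delta L = (l1 - l0) * x1 + l0 * (x1 - x0), i.e. a labour-requirements
# effect plus a demand effect. All numbers are illustrative, not from the paper.
l0 = np.array([0.8, 0.5, 1.2])    # labour coefficients, period 0 (jobs per unit output)
l1 = np.array([0.7, 0.5, 1.1])    # labour coefficients, period 1
x0 = np.array([100., 200., 50.])  # sectoral output, period 0
x1 = np.array([130., 260., 80.])  # sectoral output, period 1

tech_effect   = (l1 - l0) * x1    # technical (labour-requirements) change
demand_effect = l0 * (x1 - x0)    # demand change
total_change  = l1 * x1 - l0 * x0 # the two effects sum exactly to this
```

The identity holds term by term, so the sectoral contributions of each effect can be read off directly from the two arrays.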

Relevance: 90.00%

Abstract:

In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic problem and then apply simulation to the latter. To achieve this goal, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, since it is not constrained by any prior assumption and relies on well-tested heuristics.
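A minimal sketch of the simulation side of such an approach, assuming a permutation flow shop and uniform noise around mean processing times (both assumptions for illustration; the paper's heuristic and distributions are not specified here):

```python
import random

# Sketch of the core idea: estimate the expected makespan of a given job
# permutation by Monte Carlo sampling of the stochastic processing times.
# The 3-job, 2-machine instance and the noise model are illustrative.

def makespan(perm, times):
    """Permutation flow shop makespan; times[j][m] = time of job j on machine m."""
    n_mach = len(times[0])
    finish = [0.0] * n_mach
    for j in perm:
        for m in range(n_mach):
            start = max(finish[m], finish[m - 1] if m > 0 else 0.0)
            finish[m] = start + times[j][m]
    return finish[-1]

def expected_makespan(perm, mean_times, n_samples=2000, seed=1):
    """Monte Carlo estimate: sample times around their means, average makespans."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        sampled = [[rng.uniform(0.8 * t, 1.2 * t) for t in job] for job in mean_times]
        total += makespan(perm, sampled)
    return total / n_samples

mean_times = [[3.0, 2.0], [2.0, 4.0], [4.0, 1.0]]  # 3 jobs, 2 machines
est = expected_makespan([0, 1, 2], mean_times)
```

A deterministic heuristic would then rank permutations by this estimate instead of by the deterministic makespan of the mean times.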


Relevance: 90.00%

Abstract:

Monte Carlo simulations were used to generate data for ABAB designs of different lengths. The points of change in phase are randomly determined before gathering behaviour measurements, which allows the use of a randomization test as an analytic technique. Data simulation and analysis can be based either on data-division-specific or on common distributions. Following one method or another affects the results obtained after the randomization test has been applied. Therefore, the goal of the study was to examine these effects in more detail. The discrepancies in these approaches are obvious when data with zero treatment effect are considered and such approaches have implications for statistical power studies. Data-division-specific distributions provide more detailed information about the performance of the statistical technique.
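The sketch below illustrates the general mechanics of such a randomization test on an ABAB series, with random phase-change points constrained by a minimum phase length; the data, statistic and constraints are illustrative assumptions, not the simulation conditions of the study.

```python
import random
from statistics import mean

# Minimal sketch of a randomization test for an ABAB design: the three
# phase-change points are drawn at random (with a minimum phase length),
# and the observed A-vs-B mean difference is compared with the distribution
# obtained under random change points. Data here are illustrative.

def ab_statistic(data, changes):
    """Mean of B-phase points minus mean of A-phase points for ABAB."""
    c1, c2, c3 = changes
    b = data[c1:c2] + data[c3:]
    a = data[:c1] + data[c2:c3]
    return mean(b) - mean(a)

def randomization_test(data, observed_changes, min_len=3, n_rand=1000, seed=0):
    rng = random.Random(seed)
    n = len(data)
    obs = ab_statistic(data, observed_changes)
    count = 0
    for _ in range(n_rand):
        while True:  # rejection-sample change points with minimum phase length
            c = sorted(rng.sample(range(min_len, n - min_len + 1), 3))
            if c[1] - c[0] >= min_len and c[2] - c[1] >= min_len:
                break
        if ab_statistic(data, c) >= obs:
            count += 1
    return count / n_rand

data = [2, 3, 2, 3, 2, 6, 7, 6, 7, 6, 2, 3, 2, 3, 2, 7, 6, 7, 6, 7]  # clear effect
p = randomization_test(data, observed_changes=(5, 10, 15))
```

The distinction the abstract draws (data-division-specific versus common distributions) concerns how the simulated data themselves are generated, which this sketch leaves fixed.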

Relevance: 80.00%

Abstract:

The parameterized expectations algorithm (PEA) involves a long simulation and a nonlinear least squares (NLS) fit, both embedded in a loop. Both steps are natural candidates for parallelization. This note shows that parallelization can lead to substantial speedups for the PEA. I provide example code for a simple model that can serve as a template for parallelizing more interesting models, as well as a download link for an image of a bootable CD that allows creation of a cluster and execution of the example code in minutes, with no need to install any software.
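The chunked-simulation idea can be sketched as below; a thread pool and a toy AR(1) process stand in for the cluster and the economic model of the note, and restarting each chunk from zero is a simplification that a real PEA implementation would avoid by passing each chunk its initial state.

```python
from concurrent.futures import ThreadPoolExecutor
import random

# Hedged sketch of the parallelization idea: the long simulation is split
# into independent chunks evaluated concurrently, then recombined before
# the NLS step. The "model" here is a toy AR(1) draw, not the note's model.

def simulate_chunk(args):
    seed, n = args
    rng = random.Random(seed)
    x, path = 0.0, []
    for _ in range(n):
        x = 0.9 * x + rng.gauss(0.0, 1.0)   # toy AR(1) step
        path.append(x)
    return path

chunks = [(seed, 2500) for seed in range(4)]     # 4 chunks of 2500 periods
with ThreadPoolExecutor(max_workers=4) as ex:
    paths = list(ex.map(simulate_chunk, chunks))
long_simulation = [x for p in paths for x in p]  # recombined 10000-period path
```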

Relevance: 80.00%

Abstract:

Report for the scientific sojourn carried out at the Department of Chemistry, University of North Texas (USA), from September until November 2006. It covers two computational chemistry studies: an experimental and computational study of the intra- and intermolecular hydroarylation of isonitriles, and the development of an improved catalyst for hydrocarbon functionalization.

Relevance: 80.00%

Abstract:

A long development time is needed from the design to the implementation of an AUV. During the first steps, simulation plays an important role, since it allows for the development of preliminary versions of the control system to be integrated. Once the robot is ready, the control systems are implemented, tuned and tested. The use of a real-time simulator can help close the gap between off-line simulation and real testing on the already implemented robot. When properly interfaced with the robot hardware, a real-time graphical simulation in a "hardware in the loop" configuration allows testing of the implemented control system running on the actual robot hardware. Hence, the development time is drastically reduced. This paper overviews the field of graphical simulators used for AUV development and proposes a classification. It also presents NEPTUNE, a multi-vehicle, real-time, graphical simulator based on OpenGL that allows hardware-in-the-loop simulations.

Relevance: 80.00%

Abstract:

A graphical processing unit (GPU) is a hardware device normally used to manipulate computer memory for the display of images. GPU computing is the practice of using a GPU device for scientific or general purpose computations that are not necessarily related to the display of images. Many problems in econometrics have a structure that allows for successful use of GPU computing. We explore two examples. The first is simple: repeated evaluation of a likelihood function at different parameter values. The second is a more complicated estimator that involves simulation and nonparametric fitting. We find speedups from 1.5 up to 55.4 times, compared to computations done on a single CPU core. These speedups can be obtained with very little expense, energy consumption, and time dedicated to system maintenance, compared to equivalent performance solutions using CPUs. Code for the examples is provided.
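The first example's structure — one likelihood, many parameter values — can be sketched on the CPU with vectorized numpy; a GPU version would execute the same batched arithmetic on device arrays. The normal model and parameter grid below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

# CPU sketch of "repeated evaluation of a likelihood function at different
# parameter values": a normal log likelihood evaluated at a whole grid of
# candidate means in one batched operation.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=1000)

mus    = np.linspace(0.0, 2.0, 201)              # candidate means
sigmas = np.full_like(mus, 2.0)                  # fixed scale for the sketch

# log L(mu, sigma) summed over observations, for all candidates at once
z = (data[None, :] - mus[:, None]) / sigmas[:, None]
loglik = -0.5 * (z ** 2).sum(axis=1) - data.size * np.log(sigmas * np.sqrt(2 * np.pi))
best_mu = mus[np.argmax(loglik)]
```

The batched layout is what makes the GPU mapping natural: each row of `z` is an independent evaluation.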

Relevance: 80.00%

Abstract:

Deformable mirrors of ever larger size, with ever more actuators, are currently being used in adaptive optics applications. The control of mirrors with hundreds of actuators is a topic of great interest, since classical control techniques based on the pseudoinverse of the system control matrix become too slow when dealing with matrices of such large dimensions. This doctoral thesis proposes a method for accelerating and parallelizing the control algorithms of these mirrors, through the application of a control technique based on zeroing the smallest components of the control matrix (sparsification), followed by optimization of the actuator ordering according to the shape of the matrix, and finally its subsequent division into small tridiagonal blocks. These blocks are much smaller and easier to use in the calculations, which allows much higher computation speeds thanks to the elimination of the null components in the control matrix. Moreover, this approach allows the parallelization of the computation, giving the system an additional speed component. Even without parallelization, an increase of almost 40% in convergence speed has been obtained with the proposed technique for mirrors with only 37 actuators. To validate this, a complete new experimental setup has been implemented, including a programmable phase modulator for the generation of turbulence by means of phase screens, and a complete model of the control loop has been developed to investigate the performance of the proposed algorithm. The results, both in simulation and experimentally, show full equivalence in the deviation values after compensation of the different types of aberrations for the different algorithms used, although the method proposed here entails a much lower computational load. The procedure is expected to be very successful when applied to very large mirrors.
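A toy numpy sketch of the sparsification step (zeroing the smallest components of the control matrix) is given below; the near-banded Gaussian coupling matrix and the threshold are illustrative stand-ins, not the thesis' actual influence matrices, and the actuator reordering and block-tridiagonal division are omitted.

```python
import numpy as np

# Illustrative sketch: zero control-matrix components below a threshold and
# compare the command computed with the sparsified matrix against the full one.
rng = np.random.default_rng(0)
n = 37                                    # actuators, as in the thesis example
d = np.subtract.outer(range(n), range(n))
C = np.exp(-0.5 * (d / 2.0) ** 2)         # near-banded coupling (stand-in matrix)

threshold = 0.05 * np.abs(C).max()
C_sparse = np.where(np.abs(C) >= threshold, C, 0.0)

slopes = rng.normal(size=n)               # measured wavefront signal (stand-in)
cmd_full   = C @ slopes
cmd_sparse = C_sparse @ slopes            # cheaper: most entries are zero

fill = np.count_nonzero(C_sparse) / C_sparse.size
err  = np.linalg.norm(cmd_full - cmd_sparse) / np.linalg.norm(cmd_full)
```

With a sparse storage format the zeroed entries drop out of the multiplication entirely, which is where the speedup described in the thesis comes from.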

Relevance: 80.00%

Abstract:

We report on the study and modeling of the structural and optical properties of rib-loaded waveguides working in the 600-900 nm spectral range. A Si nanocrystal (Si-nc) rich SiO2 layer with nominal Si excess ranging from 10% to 20% was produced by quadrupole ion implantation of Si into thermal SiO2 formed on a silicon substrate. Si-ncs were precipitated by annealing at 1100°C, forming a 0.4-µm-thick core layer in the waveguide. The Si content, the Si-nc density and size, the Si-nc emission, and the active layer effective refractive index were determined by dedicated experiments using x-ray photoelectron spectroscopy, Raman spectroscopy, energy-filtered transmission electron microscopy, photoluminescence and m-lines spectroscopy. Rib-loaded waveguides were fabricated by photolithographic and reactive ion etching processes, with patterned rib widths ranging from 1 to 8 µm. Light propagation in the waveguide was observed, and losses of 11 dB/cm at 633 and 780 nm were measured, modeled and interpreted.

Relevance: 80.00%

Abstract:

We present a numerical and partially analytical study of classical particles obeying a Langevin equation that describes diffusion on a surface modeled by a two-dimensional potential. The potential may be either periodic or random. Depending on the potential and the damping, we observe superdiffusion, large-step diffusion, diffusion, and subdiffusion. Superdiffusive behavior is associated with low damping and is in most cases transient, albeit often long. Subdiffusive behavior is associated with highly damped particles in random potentials. In some cases subdiffusive behavior persists over our entire simulation and may be characterized as metastable. In any case, we stress that this rich variety of behaviors emerges naturally from an ordinary Langevin equation for a system described by ordinary canonical Maxwell-Boltzmann statistics.
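A minimal Euler-Maruyama sketch of the kind of dynamics studied — an underdamped Langevin particle on a two-dimensional periodic potential U(x, y) = U0 (cos x + cos y) — is shown below; the potential, damping and temperature are illustrative choices, not the paper's parameter sets.

```python
import numpy as np

# Sketch (not the paper's code) of an underdamped Langevin particle on a 2D
# periodic potential, integrated with Euler-Maruyama at unit mass.
rng = np.random.default_rng(0)
U0, gamma, kT, dt, steps = 1.0, 0.5, 1.0, 0.01, 20000

pos = np.zeros(2)
vel = np.zeros(2)
traj = np.empty((steps, 2))
for i in range(steps):
    force = U0 * np.sin(pos)                      # -grad U for the cosine potential
    noise = np.sqrt(2 * gamma * kT * dt) * rng.normal(size=2)
    vel  += (force - gamma * vel) * dt + noise    # Langevin velocity update
    pos  += vel * dt
    traj[i] = pos

sq_disp = ((traj - traj[0]) ** 2).sum(axis=1)     # squared displacement vs time
```

Averaging `sq_disp` over many independent trajectories and fitting its growth exponent is how the diffusive regimes described above (sub-, normal, super-diffusion) would be distinguished.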

Relevance: 80.00%

Abstract:

We develop a statistical theory to characterize correlations in weighted networks. We define the appropriate metrics quantifying correlations and show that strictly uncorrelated weighted networks do not exist due to the presence of structural constraints. We also introduce an algorithm for generating maximally random weighted networks with arbitrary P(k,s) to be used as null models. The application of our measures to real networks reveals the importance of weights in a correct understanding and modeling of these heterogeneous systems.
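One simple weighted-correlation measure in this spirit, the average strength s(k) of nodes of degree k, can be computed directly from a weighted edge list; the toy graph below is an illustrative assumption, not data from the paper.

```python
from collections import defaultdict

# Average strength s(k) of nodes of degree k in a small weighted network.
edges = [  # (node_u, node_v, weight) -- illustrative toy graph
    (0, 1, 1.0), (0, 2, 2.0), (0, 3, 1.0),
    (1, 2, 3.0), (2, 3, 1.0), (3, 4, 2.0),
]

degree = defaultdict(int)
strength = defaultdict(float)
for u, v, w in edges:
    for node in (u, v):
        degree[node] += 1       # degree k: number of incident edges
        strength[node] += w     # strength s: total incident weight

by_degree = defaultdict(list)
for node in degree:
    by_degree[degree[node]].append(strength[node])
s_of_k = {k: sum(vals) / len(vals) for k, vals in sorted(by_degree.items())}
```

In a weight-uncorrelated network s(k) would grow linearly in k; deviations from linearity are one signature of the weight correlations the paper quantifies.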

Relevance: 80.00%

Abstract:

The bio-economic model "Heures" is a first attempt to develop a simulation procedure to understand the Northwestern Mediterranean fisheries, to evaluate management strategies and to analyze the feasibility of implementing an adaptive management. The model is built on the interaction among three boxes simulating the dynamics of each of the basic actors of a fishery: the stock, the market and the fishermen. A fourth actor, the manager, imposes or modifies the rules or, in terms of the model, modifies some particular parameters. Thus, the model allows us to simulate and evaluate the mid-term biological and economic effects of particular management measures. The bio-economic nature of the model is given by the interaction among the three boxes, by the market simulation and, particularly, by the fishermen's behaviour. This last element confers on the model its Mediterranean "self-regulated" character. The fishermen allocate their investments to maximize fishing mortality but, having a legal effort limit, they invest in maintenance and technology in order to increase the catchability, which, as a consequence, will be a function of the invested capital.
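The three-box loop described can be caricatured in a few lines; every functional form and number below is an illustrative assumption (logistic stock growth, a price inversely related to landings, reinvestment of a fixed share of revenue), not the actual equations of "Heures".

```python
# Caricature of the stock / market / fishermen loop, with the manager's
# effort limit held fixed. All parameters and forms are illustrative.
stock, capital, effort_limit = 1000.0, 10.0, 1.0
r, q_per_capital, price_scale = 0.3, 0.004, 50.0

history = []
for year in range(20):
    catchability = q_per_capital * capital              # fishermen box: capital raises q
    catch = min(stock, catchability * effort_limit * stock)
    stock += r * stock * (1 - stock / 2000.0) - catch   # stock box: logistic growth
    price = price_scale / (1.0 + catch / 100.0)         # market box: price falls with supply
    revenue = price * catch
    capital += 0.1 * revenue / price_scale              # reinvest a share in technology
    history.append((round(stock, 1), round(catch, 1)))
```

The manager enters by changing `effort_limit` (or other parameters) between years, which is exactly the kind of mid-term management experiment the model is built to evaluate.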