16 results for Runs of homozygosity


Relevance:

100.00%

Publisher:

Abstract:

Estimates of effective population size in the Holstein cattle breed have usually been low despite the large number of animals that constitute this breed. Effective population size is inversely related to the rates at which coancestry and inbreeding increase, and these rates have been high as a consequence of intense and accurate selection. Traditionally, coancestry and inbreeding coefficients have been calculated from pedigree data. However, the development of genome-wide single nucleotide polymorphisms has increased interest in calculating these coefficients from molecular data in order to improve their accuracy. In this study, genomic estimates of coancestry, inbreeding and effective population size were obtained in the Spanish Holstein population and then compared with pedigree-based estimates. A total of 11,135 animals genotyped with the Illumina BovineSNP50 BeadChip were available for the study. After applying filtering criteria, the final genomic dataset included 36,693 autosomal SNPs and 10,569 animals. Pedigree data from those genotyped animals included 31,203 animals. These individuals represented only the last five generations in order to homogenise the amount of pedigree information across animals. Genomic estimates of coancestry and inbreeding were obtained from identity by descent segments (coancestry) or runs of homozygosity (inbreeding). The results indicate that the percentage of variance of pedigree-based coancestry estimates explained by genomic coancestry estimates was higher than that for inbreeding. Estimates of effective population size obtained from genome-wide and pedigree information were consistent and ranged from about 66 to 79. These low values emphasize the need to control the rate of increase of coancestry and inbreeding in Holstein selection programmes.
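As an illustrative sketch (not the authors' code, and with invented numbers), the two quantities at the core of this abstract can be computed as follows: genomic inbreeding as the fraction of the autosome covered by runs of homozygosity (F_ROH), and effective population size from the per-generation rate of inbreeding increase, Ne ≈ 1/(2·ΔF):

```python
# Hypothetical sketch: genomic inbreeding from runs of homozygosity (F_ROH)
# and effective population size from the rate of inbreeding increase.
# The segment lengths and genome length below are illustrative, not the study's data.

AUTOSOMAL_LENGTH_MB = 2500.0  # approximate autosomal genome length covered by SNPs

def f_roh(roh_segments_mb, genome_length_mb=AUTOSOMAL_LENGTH_MB):
    """Inbreeding coefficient as the fraction of the autosome lying in ROH."""
    return sum(roh_segments_mb) / genome_length_mb

def effective_population_size(delta_f):
    """Ne from the per-generation rate of inbreeding increase: Ne = 1/(2*dF)."""
    return 1.0 / (2.0 * delta_f)

# Example: an animal whose ROH total 250 Mb has F_ROH = 0.10
animal_f = f_roh([100.0, 80.0, 45.0, 25.0])

# A per-generation inbreeding increase of 0.0075 gives Ne close to the
# lower end of the 66-79 range reported above.
ne = effective_population_size(0.0075)
```

With ΔF = 0.0075 per generation, the sketch returns Ne ≈ 66.7, at the low end of the 66-79 range reported above.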

Relevance:

90.00%

Publisher:

Abstract:

This work explores the automatic recognition of physical activity intensity patterns from multi-axial accelerometry and heart rate signals. Data collection was carried out in free-living conditions and in three controlled gymnasium circuits, for a total of 179.80 h of data, divided into sedentary situations (65.5%), light-to-moderate activity (17.6%) and vigorous exercise (16.9%). The proposed machine learning algorithms comprise the following steps: time-domain feature definition, standardization and PCA projection, unsupervised clustering (by k-means and GMM) and an HMM to account for long-term temporal trends. Performance was evaluated by 30 runs of a 10-fold cross-validation. Both k-means and GMM-based approaches yielded high overall accuracy (86.97% and 85.03%, respectively) and, given the imbalance of the dataset, meritorious F-measures (up to 77.88%) for non-sedentary cases. Classification errors tended to be concentrated around transients, which constrains their practical impact. Hence, we consider our proposal to be suitable for 24 h-based monitoring of physical activity in ambulatory scenarios and a first step towards intensity-specific energy expenditure estimators.
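The F-measure quoted above for the non-sedentary classes can be illustrated with a minimal sketch (hypothetical confusion counts, not the study's data):

```python
# Illustrative sketch (not the authors' code): the F-measure used to judge
# minority classes on an imbalanced dataset, from raw confusion counts.

def f_measure(tp, fp, fn):
    """Harmonic mean of precision and recall; a robust summary under class imbalance."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for a minority "vigorous exercise" class:
score = f_measure(tp=70, fp=20, fn=20)  # precision = recall = 7/9
```

Because precision and recall are combined harmonically, a class that is rarely predicted (high fn) or over-predicted (high fp) is penalized even when overall accuracy looks high.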

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work was to study the effect of two technical modifications to continuous-culture fermenters (supplementation with sponge materials (ES) and provision of a filter system (FIL)) on the microbial populations and ruminal fermentation parameters over the sampling period. Six fermenters fed a 50:50 alfalfa hay:concentrate diet, inoculated with rumen liquor from sheep fed the same diet, were used in two incubation runs of 14 days each. On days 10 and 14, samples were taken for analysis of fermentation parameters (volatile fatty acids, ammonia-N and lactate) and microbial populations. Neither technical modification affected (P>0.05) concentrations of bacterial DNA or the relative abundance of fungi and archaea, but protozoal DNA concentrations were higher (P<0.05) in ES and FIL fermenters than in the control ones. However, values of protozoal DNA were about 50 times lower than in the rumen fluid used as inoculum for the fermenters. The tested technical modifications did not affect (P>0.05) any fermentation parameter, and there were no differences in fermentation parameters between days 10 and 14, with the exception of lactate production, which was higher (P=0.009) on day 14 than on day 10. In conclusion, the technical modifications tested maintained protozoa in continuous-culture fermenters without any effect on fermentation parameters and other microbial populations, but protozoa concentrations were still lower than those in the rumen.

Relevance:

30.00%

Publisher:

Abstract:

Methodology and results of full-scale maneuvering trials for the Riverine Support Patrol Vessel (RSPV), built by COTECMAR for the Colombian Navy, are presented. This ship is equipped with a pump-jet propulsion system and has a wide hull with a high beam-draft ratio (B/T = 9.5). Tests were based on the results of simulations of turning diameters obtained from the TRIBON M3© design software, applying Design of Experiments (DOE) techniques to rationalize the number of runs under different conditions of water depth, ship speed, and rudder angle. The results validate the excellent performance of this class of ship and show that the turning diameter and other maneuvering characteristics improve with decreasing water depth.
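The DOE step described above starts from a full-factorial candidate set over the trial factors; a minimal sketch (with invented factor levels) is:

```python
# Hedged sketch: a full-factorial design over the trial factors mentioned above
# (water depth, ship speed, rudder angle). The factor levels are invented for
# illustration; DOE techniques then fractionate such a design to cut the run count.

from itertools import product

water_depths = ["shallow", "medium", "deep"]   # qualitative levels (hypothetical)
ship_speeds = [4, 8, 12]                       # knots (hypothetical)
rudder_angles = [10, 20, 30]                   # degrees (hypothetical)

# Cartesian product of all factor levels: every candidate trial condition.
full_factorial = list(product(water_depths, ship_speeds, rudder_angles))
n_runs = len(full_factorial)  # 3 x 3 x 3 = 27 candidate runs before DOE reduction
```

A fractional design would then select a structured subset of these 27 combinations, which is how the trials were rationalized.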

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an automatic modulation classifier (AMC) for electronic warfare applications. It is a pattern recognition modulation classifier based on statistical features of the phase and the instantaneous frequency. The classifier runs in real time at sampling rates in excess of 1 Gsample/s. The hardware platform for this application is a Field Programmable Gate Array (FPGA). The AMC is a subsidiary module of a digital channelised receiver implemented on the same platform.
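As a hedged illustration of the phase-based features such a classifier can use (not the paper's implementation), instantaneous frequency can be estimated from unwrapped phase differences scaled by the sampling rate:

```python
# Sketch under assumptions: instantaneous frequency estimated from the
# (unwrapped) phase difference between consecutive samples. This is one
# classical phase-derived feature; the paper's exact features are not given here.

import math

def unwrap(phases):
    """Remove 2*pi jumps from a wrapped phase sequence."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))  # fold jump into [-pi, pi]
        out.append(out[-1] + d)
    return out

def instantaneous_frequency(phases, fs):
    """Frequency (Hz) from consecutive unwrapped phase differences."""
    u = unwrap(phases)
    return [(u[i + 1] - u[i]) * fs / (2 * math.pi) for i in range(len(u) - 1)]

# A pure 1 kHz tone sampled at 100 kHz yields a flat 1 kHz estimate:
fs, f0 = 100_000.0, 1_000.0
phases = [math.fmod(2 * math.pi * f0 * n / fs, 2 * math.pi) for n in range(32)]
freqs = instantaneous_frequency(phases, fs)
```

Statistics of such a sequence (mean, variance, higher moments) are the kind of features a pattern recognition AMC can feed to its decision stage.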

Relevance:

30.00%

Publisher:

Abstract:

DynaLearn (http://www.DynaLearn.eu) develops a cognitive artefact that engages learners in an active learning-by-modelling process to develop conceptual system knowledge. Learners create external representations using diagrams. The diagrams capture conceptual knowledge using the Garp3 Qualitative Reasoning (QR) formalism [2]. The expressions can be simulated, confronting learners with their logical consequences. To further aid learners, DynaLearn employs a sequence of knowledge representations (Learning Spaces, LS) with increasing complexity in terms of the modelling ingredients a learner can use [1]. An online repository contains QR models created by experts/teachers and learners. The server runs semantic services [4] to generate feedback at the request of learners via the workbench. The feedback is communicated to the learner via a set of virtual characters, each having its own competence [3]. A specific feedback thus incorporates three aspects: content, character appearance, and a didactic setting (e.g. quiz mode). In the interactive event we will demonstrate the latest achievements of the DynaLearn project: first, the six learning spaces for learners to work with; second, the generation of feedback relevant to the individual needs of a learner using Semantic Web technology; third, the verbalization of the feedback via different animated virtual characters, notably Basic help, Critic, Recommender, Quizmaster and Teachable agent.

Relevance:

30.00%

Publisher:

Abstract:

A local proper orthogonal decomposition (POD) plus Galerkin projection method was recently developed to accelerate time-dependent numerical solvers of PDEs. This method is based on the combined use of a numerical code (NC) and a Galerkin system (GS) in a sequence of interspersed time intervals, INC and IGS, respectively. POD is performed on some sets of snapshots calculated by the numerical solver in the INC intervals. The governing equations are Galerkin projected onto the most energetic POD modes and the resulting GS is time integrated in the next IGS interval. The major computational effort is associated with the snapshot calculation in the first INC interval, where the POD manifold needs to be completely constructed (it is only updated in subsequent INC intervals, which can thus be quite small). As the POD manifold depends only weakly on the particular values of the parameters of the problem, a suitable library can be constructed by adapting the snapshots calculated in other runs to drastically reduce the size of the first INC interval and thus the involved computational cost. The strategy is successfully tested in (i) the one-dimensional complex Ginzburg-Landau equation, including the case in which it exhibits transient chaos, and (ii) the two-dimensional unsteady lid-driven cavity problem.
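The POD step described above can be sketched with an SVD of a snapshot matrix; this is a minimal illustration with synthetic snapshots, not the authors' code:

```python
# Minimal POD sketch (illustrative, not the authors' implementation):
# snapshots from the solver are collected as columns of a matrix, and the most
# energetic modes are the left singular vectors of its SVD. The Galerkin system
# is then built on these modes. All data below are synthetic.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical snapshot matrix: 100 spatial points, 20 snapshots that really
# live on a 3-dimensional manifold, plus small noise.
basis = rng.standard_normal((100, 3))
coeffs = rng.standard_normal((3, 20))
snapshots = basis @ coeffs + 1e-6 * rng.standard_normal((100, 20))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)          # cumulative "energy" fraction
n_modes = int(np.searchsorted(energy, 0.999)) + 1  # modes capturing 99.9% energy
pod_modes = U[:, :n_modes]                        # columns = retained POD modes
```

Truncating at a fixed energy fraction is what makes the Galerkin system small; updating the manifold in later INC intervals amounts to redoing this decomposition on an augmented snapshot set.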

Relevance:

30.00%

Publisher:

Abstract:

Directive 2003/10/EC of the European Parliament and of the Council, of 6 February 2003, specifies, pursuant to Article 16(1) of Directive 89/391/EEC, the minimum health and safety requirements regarding the exposure of workers to the risks arising from physical agents (noise). In the music industry, and specifically among orchestra musicians, exposure of more than eight hours a day to a sound pressure level of 80 dB(A) or more is very common. This situation can cause hearing damage such as hyperacusis, hearing loss, tinnitus or rupture of the basilar membrane, among others. This means that measures must be taken to implement the regulations in the most reasonable way possible, so that the musician's performance, the dynamics and the musical concept to be conveyed to the audience are affected as little as possible. To reduce the auditory load on orchestra musicians from strong sound impacts coming from neighbouring instruments, the use of acoustic panels is being investigated: placed at strategic points within the orchestra, they can reduce the sound impact on the ear by up to 20 dB. Brass and percussion instruments are responsible for the highest sound pressure emission. To protect musicians' ears from these impacts, the panels are placed as a barrier between those instruments and the musicians seated in front of them, thus protecting the ears of the most affected musicians. To assess the practical effect of these panels on an orchestral ensemble, several recordings are made during rehearsals and concerts of several orchestras. The microphones are placed at ear height, no more than 10 cm from the ear, both for several of the most affected musicians and for the musicians responsible for the strong sound emission.
In this way, the sound pressure levels perceived by each musician can be compared and the level differences between them evaluated. Variable panel configurations are also used to compare the sound pressure differences between the different possible placements and thus decide on the best location and configuration of the panels. Next, once the audio samples and the data files measured with an audio analyser at different positions in the orchestra have been obtained, everything is calibrated and analysed with a program developed in Matlab, in order to evaluate the effect of the panels on the musicians' auditory perception, with particular emphasis on the analysis of sound pressure level (SPL) differences. By computing the envelope of the level differences, the attenuation effect of the acoustic panels on the orchestra musicians is evaluated statistically. The method is based on the statistics of several musical samples since, with live music, the dynamics and the synchronisation between the musicians vary from one performance to another. These factors, together with the fact that each musician's part is different, make it difficult to compare two signals recorded at different points in the orchestra. Several musical samples are therefore needed to evaluate the attenuation effect of the panels in the different configurations mentioned above. The complete study of the effect of the panels, as part of the environment that influences orchestra musicians on stage, aims to improve their working conditions.
Abstract: For several years, the European Union has been adopting laws and regulations to protect and give more security to people who are exposed to risk in their jobs. Being exposed to a loud sound pressure level for many hours at work carries a risk of hearing damage. Particularly in the field of music, the ear is the most important working tool, and not taking care of it can cause damage such as hearing loss, tinnitus, hyperacusis, diplacusis, etc. This can affect the efficiency and satisfaction of musicians when they are playing, which can also cause stress problems. Orchestra musicians, like many other workers in this sector, are usually exposed to a sound level of 80 dB(A) or more for more than eight hours per day. This means that they must comply with the law and their legal obligations to avoid health problems arising from their job. Putting the new regulations into practice is a challenge for orchestras: they must make sure that the repertoire, with its dynamics, balance and feeling, is not affected by the reduction of sound levels imposed by the law. This study investigates the benefits and disadvantages of using shields as hearing protectors during orchestral rehearsals and concerts.
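The attenuation figures discussed above (up to 20 dB) follow from the logarithmic definition of sound pressure level; a minimal sketch of the level-difference computation is:

```python
# Illustrative sketch: the level difference between two microphone positions,
# computed from RMS sound pressures. The pressures below are invented; this is
# how a panel attenuation of up to 20 dB can be quantified.

import math

P_REF = 20e-6  # reference pressure in Pa (20 micropascals)

def spl_db(p_rms):
    """Sound pressure level in dB re 20 uPa."""
    return 20.0 * math.log10(p_rms / P_REF)

def level_difference(p_exposed, p_shielded):
    """Attenuation in dB between an unshielded and a shielded position."""
    return spl_db(p_exposed) - spl_db(p_shielded)

# A pressure ratio of 10 corresponds to a 20 dB reduction:
attenuation = level_difference(2.0, 0.2)
```

Because the scale is logarithmic, halving the pressure only removes about 6 dB; a 20 dB reduction requires cutting the pressure by a factor of ten.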

Relevance:

30.00%

Publisher:

Abstract:

The fundamental objective of this Ph.D. dissertation is to demonstrate that, under particular circumstances which cover most structures of practical interest, periodic structures can be understood and analyzed by means of closed waveguide theories and techniques. To that end, in the first place a transversely periodic cylindrical structure is considered and the wave equation, under a combination of perfectly conducting and periodic boundary conditions, is studied. This theoretical study runs parallel to the classic analysis of perfectly conducting closed waveguides. In the light shed by the aforementioned study it is clear that, under certain very common periodicity conditions, transversely periodic cylindrical structures share many properties with closed waveguides. In particular, they can be characterized by a complete set of TEM, TE and TM modes. As a result, this dissertation introduces the transversely periodic waveguide concept. Once the analogies between the modes of a transversely periodic waveguide and those of a closed waveguide have been established, a generalization of a well-known closed waveguide characterization method, the generalized Transverse Resonance Technique, is developed to obtain the transversely periodic modes. At this point, all the necessary elements for the consideration of discontinuities between two different transversely periodic waveguides are at our disposal. The analysis of this type of discontinuity is carried out by means of another well-known closed waveguide method, the Mode Matching technique. This dissertation contains a sufficient number of examples, including the analysis of a wire-medium slab, a periodic surface of cross-shaped patches and a parallel-plate waveguide with a textured surface, to demonstrate that the Transverse Resonance Technique-Mode Matching hybrid is highly precise, efficient and versatile.
Thus, the initial statement, "periodic structures can be understood and analyzed by means of closed waveguide theories and techniques", will be corroborated. Finally, this dissertation contains an adaptation of the aforementioned generalized Transverse Resonance Technique by means of which the analysis of laterally open periodic waveguides, such as the well-known Substrate Integrated Waveguides, can be carried out without any approximation. The analysis of this type of structure has attracted a lot of interest in the recent past, and the analysis techniques previously proposed always resorted to some kind of fictitious wall to close the structure.

Relevance:

30.00%

Publisher:

Abstract:

Many cities in Europe have difficulty meeting the air quality standards set by European legislation, most particularly the annual mean Limit Value for NO2. Road transport is often the main source of air pollution in urban areas and, therefore, there is an increasing need to estimate current and future traffic emissions as accurately as possible. As a consequence, a number of specific emission models and emission factor databases have been developed recently. They present important methodological differences, may result in largely diverging emission figures and may thus lead to alternative policy recommendations. This study compares two approaches to estimate road traffic emissions in Madrid (Spain): the COmputer Programme to calculate Emissions from Road Transport (COPERT4 v.8.1) and the Handbook Emission Factors for Road Transport (HBEFA v.3.1), representative of the 'average-speed' and 'traffic situation' model types, respectively. The input information (e.g. fleet composition, vehicle kilometres travelled, traffic intensity, road type, etc.) was provided by the traffic model developed by the Madrid City Council along with observations from field campaigns. Hourly emissions were computed for nearly 15 000 road segments distributed in 9 management areas covering the city of Madrid and its surroundings. Total annual NOX emissions predicted by HBEFA were 21% higher than those of COPERT. The discrepancies for NO2 were lower (13%) since the resulting average NO2/NOX ratios are lower for HBEFA. The largest differences are related to diesel vehicle emissions under "stop & go" traffic conditions, very common on distributor/secondary roads of the Madrid metropolitan area. In order to understand the representativeness of these results, the resulting emissions were integrated into an urban-scale inventory used to drive mesoscale air quality simulations with the Community Multiscale Air Quality (CMAQ) modelling system (1 km2 resolution).
Modelled NO2 concentrations were compared with observations through a series of statistics. Although there are no remarkable differences between the two model runs, the results suggest that HBEFA may overestimate traffic emissions. However, the results are strongly influenced by methodological issues and limitations of the traffic model. This study was useful to provide a first alternative estimate to the official emission inventory in Madrid and to identify the main features of the traffic model that should be improved to support the application of an emission system based on "real world" emission factors.
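The reported discrepancies (21% for NOX, 13% for NO2) are simple relative differences; a sketch with invented totals reproduces the arithmetic:

```python
# Sketch with invented numbers: comparing two emission estimates the way the
# study compares HBEFA against COPERT (relative difference, NO2/NOx ratio).
# The totals and ratios below are hypothetical, not the inventory's values.

def relative_difference_pct(value, reference):
    """Percentage difference of `value` with respect to `reference`."""
    return 100.0 * (value - reference) / reference

# Hypothetical annual NOx totals (tonnes): HBEFA 21% above COPERT.
copert_nox, hbefa_nox = 10_000.0, 12_100.0
nox_diff = relative_difference_pct(hbefa_nox, copert_nox)  # 21.0

# A lower NO2/NOx ratio for HBEFA shrinks the NO2 discrepancy
# relative to the NOx one, as observed in the study.
copert_no2 = copert_nox * 0.30   # hypothetical NO2/NOx ratio 0.30
hbefa_no2 = hbefa_nox * 0.28     # hypothetical NO2/NOx ratio 0.28
no2_diff = relative_difference_pct(hbefa_no2, copert_no2)  # about 12.9
```

The point of the sketch is the mechanism: a model can be higher on NOX yet closer on NO2 if its NO2/NOX split is lower.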

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the assessment of the economic impact of the construction of a new road on the regional distribution of jobs. The paper summarizes different existing modelling approaches for assessing economic impacts through a literature review. Afterwards, we present a comprehensive approach for analyzing the interaction between new transport infrastructure and economic impact through an integrated model. This model has been applied to the construction of the A-40 motorway in Spain (497 km), which runs across three regions without passing through Madrid. The improved accessibility of markets or inputs may in turn lead to the relocation of labor and capital. The results suggest the existence of direct and indirect effects in other regions derived from the improvement of the transportation infrastructure, and confirm the relevance of road freight transport in some regions. We found that the changes in regional employment are substantial for some regions (increasing or decreasing jobs) but, at the same time, negligible in others. As a result, the approach provides broad guidance to national governments and other transport-related parties about the impacts of this transport policy.

Relevance:

30.00%

Publisher:

Abstract:

The crop simulation model AquaCrop, recently developed by FAO, can be used for a wide range of purposes. However, in its present form, its use over large areas, or for applications that require a large number of simulation runs (e.g., long-term analyses), is not practical without software to facilitate such applications. Two tools for managing the inputs and outputs of AquaCrop, named AquaData and AquaGIS, have been developed for this purpose and are presented here. Both software utilities have been programmed in Delphi v. 5; in addition, AquaGIS requires the Geographic Information System (GIS) programming tool MapObjects. These utilities allow the efficient management of input and output files, along with a GIS module for spatial analysis and spatial visualization of the results, facilitating knowledge dissemination. A sample application of the utilities is given here: an AquaCrop simulation analysis of the impact of climate change on wheat yield in southern Spain, which requires extensive input data preparation and output processing. The use of AquaCrop without the two utilities would have required approximately 1000 h of work, while the use of AquaData and AquaGIS reduced that time by more than 99%. Furthermore, the use of GIS made it possible to perform a spatial analysis of the results, thus providing a new option for extending the use of the AquaCrop model to scales requiring spatial and temporal analyses.

Relevance:

30.00%

Publisher:

Abstract:

With the ever-growing trend of smart phones and tablets, Android is becoming more popular every day. With more than one billion active users to date, Android is the leading technology in the smart phone arena. In addition, Android also runs on Android TV, Android smart watches and cars. Therefore, in recent years, Android applications have become one of the major development sectors in the software industry. As of mid 2013, the number of published applications on Google Play had exceeded one million and the cumulative number of downloads was more than 50 billion. A 2013 survey also revealed that 71% of mobile application developers work on developing Android applications. Considering this number of Android applications, it is quite evident that people rely on these applications on a daily basis for the completion of simple tasks, like keeping track of the weather, as well as rather complex tasks, like managing one's bank accounts. Hence, like every other kind of code, Android code also needs to be verified in order to work properly and achieve a certain confidence level. Because of the gigantic number of applications, it becomes really hard to test Android applications manually, especially when they have to be verified for various versions of the OS and also for various device configurations, such as different screen sizes and different hardware availability. Hence, recently there has been a lot of work on developing different testing methods for Android applications in the computer science community. The Android model attracts researchers because of its open-source nature: the whole research model is more streamlined when the code for both the application and the platform is readily available to analyze. Hence, there has been a great deal of research in testing and static analysis of Android applications, much of it focused on input test generation.
As a result, several testing tools are now available which focus on the automatic generation of test cases for Android applications. These tools differ from one another in the strategies and heuristics they use for test case generation. But there is still very little work on the comparison of these testing tools and the strategies they use. Recently, some research has been carried out in this regard that compared the performance of various available tools with respect to their code coverage, fault detection, ability to work on multiple platforms and ease of use. It was done by running these tools on a total of 60 real-world Android applications. The results of this research showed that, although effective, the strategies used by the tools also face limitations and hence have room for improvement. The purpose of this thesis is to extend this research in a more specific, attribute-oriented way. Attributes refer to the tasks that can be completed using the Android platform: anything ranging from a basic system call for receiving an SMS to more complex tasks like sending the user to another application from the current one. The idea is to develop a benchmark for Android testing tools based on their performance with respect to these attributes, which allows the tools to be compared attribute by attribute. For example, if an application plays an audio file, will the testing tool be able to generate a test input that warrants the execution of this audio file? Using multiple applications exercising different attributes, one can see which testing tool is more useful for which kinds of attributes. In this thesis, it was decided that 9 attributes covering the basic nature of tasks would be targeted for the assessment of three testing tools. Later, this can be done for many more attributes to compare even more testing tools.
The aim of this work is to show that this approach is effective and can be used on a much larger scale. One of the flagship features of this work, which also differentiates it from the previous work, is that the applications used are all specially made for this research. The reason is to analyze just the specific attribute the application focuses on, in isolation, and not allow the tool to get bottlenecked by something trivial that is not the main attribute under test. This means 9 applications, each focused on one specific attribute. The main contributions of this thesis are:
• A summary of the three existing testing tools and their respective techniques for automatic test input generation for Android applications.
• A detailed study of the usage of these testing tools using the 9 applications specially designed and developed for this study.
• The analysis of the results of the study carried out, and a comparison of the performance of the selected tools.

Relevance:

30.00%

Publisher:

Abstract:

A numerical simulation of the aerodynamic behavior of high-speed trains under synthetic crosswinds at a 90º yaw angle is presented. The train geometry is the aerodynamic train model (ATM). The flow description is obtained from numerical simulations using large eddy simulation (LES) and the commercial code ANSYS Fluent v14.5. A crosswind whose averaged velocity and turbulence characteristics change with distance to the ground is imposed. Turbulent fluctuations that vary temporally and spatially are simulated with the TurbSim code. The crosswind boundary condition is calculated for the distance the train runs during a simulation period. The inlet streamwise velocity boundary condition is generated using Taylor's frozen turbulence hypothesis. The model gives a time history of the forces and moments acting on the train, including averaged values, standard deviations and extreme values. Of particular interest are the spectra of the forces and moments, and the admittance spectra. For comparison, results obtained with LES and a uniform wind velocity fluctuating in time, and results obtained with the Reynolds-averaged Navier-Stokes (RANS) equations and the averaged wind conditions, are also presented.
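Taylor's frozen turbulence hypothesis, used above to build the inlet boundary condition, reinterprets a recorded time series as a spatial profile advected at the mean speed; a minimal sketch (invented series and speed) is:

```python
# Hedged sketch of Taylor's frozen-turbulence hypothesis: a velocity time
# series u(t) recorded at a point is read as a spatial profile u(x) by mapping
# each position x to the recording time t = x / U_mean, i.e. turbulence is
# assumed to be advected unchanged at the mean speed. All numbers are invented.

def frozen_turbulence_profile(u_series, dt, u_mean, positions):
    """Sample a stored time series at t = x / u_mean to build a streamwise profile."""
    profile = []
    for x in positions:
        t = x / u_mean            # time at which this position was "recorded"
        i = int(round(t / dt))    # nearest stored sample (no interpolation)
        profile.append(u_series[min(i, len(u_series) - 1)])
    return profile

u_series = [20.0, 22.0, 19.5, 21.0, 20.5]   # m/s, one sample every dt seconds
dt, u_mean = 0.1, 20.0                      # s, m/s
positions = [0.0, 2.0, 4.0]                 # m; x = U_mean * t maps to samples 0, 1, 2
profile = frozen_turbulence_profile(u_series, dt, u_mean, positions)
```

In a CFD setting the same mapping runs in reverse: the spatial fluctuation field generated by a tool such as TurbSim is fed through the inlet plane over time at the mean convection speed.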

Relevance:

30.00%

Publisher:

Abstract:

This study aims to assess the performance of the multi-layer canopy parameterizations implemented in the mesoscale WRF model, in order to understand their potential contribution to improving the description of energy fluxes and wind fields in the city of Madrid. It was found that the Building Energy Model (BEP+BEM) parameterization yielded better results than the bulk standard scheme implemented in the Noah LSM, but very close to those of the Building Effect Parameterization (BEP). The latter was deemed the best option, since its data requirements and CPU time are smaller. Two annual runs were made to feed the CMAQ chemical-transport model and assess the impact of this choice on routine air quality modelling activities.