916 results for automatic test case generation
Abstract:
As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software codesign. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the same tight constraints, e.g. on size, power consumption, and price, as embedded systems, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. Dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility, and relatively low time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture. The architecture offers a high degree of parallelism and modularity and greatly simplified instruction decoding. For this M.Sc.(Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written in SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hw/sw codesign and simulation and an extendable library of automatically configured reusable hardware blocks.
Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. A simulation model of a processor for TCP/IP packet validation was designed and tested as a test case for the environment.
Abstract:
Software testing can be used to examine how well an application meets its requirements. The goal is to find defects in the application and thereby improve its quality. The quality of an application can be defined with several metrics, such as testability. This thesis examines the implementation of automated testing for a web application, carried out with the help of a testing framework. Automated testing includes the design and implementation of test cases, the end result of which is a set of re-runnable test cases. The testing focuses on the functionality of the application and omits verification of the data updated to the database. Testing is performed without detailed knowledge of the internal operation of the application. The application under test is a web application implemented in the Mobilding project, used for managing the elements of a building. The thesis compares testing frameworks suitable for testing the user interface of a web application and aims to highlight their characteristic features. The result of the work is a set of re-runnable test cases. In addition, programming practices that can promote automated testing are discussed. These practices are based on problems observed during the implementation of the work.
Abstract:
Test case selection is important in testing, because not all test cases can be executed due to time and budget constraints. There are many different methods for test case selection; the most prominent are model-based selection, combinatorial selection, and risk-based selection. In all of these methods, test cases are created from the program specification. The model-based method makes use of models of the program's behavior, from which the most important ones are selected for testing. In combinatorial testing, test cases are formed as pairs of features, so that testing a single pair reveals the behavior of two features together. Combinatorial testing is effective at finding defects caused by one or two factors. Risk-based testing aims to assess the risks of the program and select test cases on that basis. In all of these methods, prioritization plays an important role, so that sufficient confidence is obtained from testing without an increase in costs.
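The pairwise (combinatorial) selection described above can be sketched with a greedy covering algorithm; the parameter names and values below are hypothetical, and this is one common construction, not the specific method surveyed in the thesis.

```python
from itertools import combinations, product

def pairwise_suite(params):
    """Greedily pick full combinations until every value pair
    of every two parameters is covered at least once."""
    names = list(params)
    # All (parameter pair, value pair) combinations that must be covered.
    uncovered = {
        ((a, va), (b, vb))
        for a, b in combinations(names, 2)
        for va in params[a] for vb in params[b]
    }
    suite = []
    for case in product(*params.values()):
        assignment = dict(zip(names, case))
        hit = {
            ((a, assignment[a]), (b, assignment[b]))
            for a, b in combinations(names, 2)
        } & uncovered
        if hit:                      # keep the case only if it covers new pairs
            suite.append(assignment)
            uncovered -= hit
        if not uncovered:
            break
    return suite

# Hypothetical configuration space: 3 x 2 x 2 = 12 exhaustive cases.
params = {"browser": ["firefox", "chrome", "safari"],
          "os": ["linux", "windows"],
          "locale": ["fi", "en"]}
suite = pairwise_suite(params)
print(len(suite))  # a subset of the 12 exhaustive cases still covers every pair
```

The greedy pass over the exhaustive enumeration is simple but suboptimal; production pairwise tools use smarter orderings to shrink the suite further.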
Abstract:
Percarboxylic acids are commonly used as disinfection and bleaching agents in the textile, paper, and fine chemical industries. All of these applications are based on the oxidative potential of these compounds. Despite the high interest in these chemicals, they are unstable and explosive, which increases the risk of their synthesis processes and of transportation. Therefore, safety criteria should be considered in the production process. Microreactors represent a technology that efficiently exploits the safety advantages resulting from small scale. Therefore, microreactor technology was used in the synthesis of peracetic acid and performic acid. These percarboxylic acids were produced at different temperatures, residence times, and catalyst (sulfuric acid) concentrations. Both synthesis reactions appeared to be rather fast: with performic acid, equilibrium was reached in 4 min at 313 K, and with peracetic acid in 10 min at 343 K. In addition, the experimental results were used to study the kinetics of the formation of performic acid and peracetic acid. The advantages of the microreactors in this study were efficient temperature control, even in a very exothermic reaction, and good mixing due to the short diffusion distances. Therefore, reaction rates were determined with high accuracy. Three different models were considered in order to estimate kinetic parameters such as reaction rate constants and activation energies. Of these three models, the laminar flow model with radial velocity distribution gave the most precise parameters. However, sulfuric acid creates many drawbacks in this synthesis process. Therefore, a 'greener' way of using a heterogeneous catalyst in the synthesis of performic acid in a microreactor was studied. The cation exchange resin, Dowex 50 Wx8, showed very high activity and a long lifetime in this reaction. In the presence of this catalyst, equilibrium was reached in 120 seconds at 313 K, which indicates a rather fast reaction.
In addition, the safety advantages of microreactors were investigated in this study. Four different conventional methods were used. Production of peracetic acid served as a test case, and the safety of a conventional batch process was compared with an on-site continuous microprocess. It was found that the conventional methods for the analysis of process safety might not be reliable and adequate for radically novel technology such as microreactors. This is understandable, because the conventional methods are partly based on experience, which is very limited in connection with totally novel technology. Therefore, a checklist-based method was developed to study the safety of intensified and novel processes at the early stage of process development. The checklist was formulated using the concept of layers of protection for a chemical process. The traditional process and three intensified processes of hydrogen peroxide synthesis were selected as test cases. With these real cases, it was shown that process intensification can have several positive and negative effects on safety. The general claim that safety is always improved by process intensification was questioned.
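As an aside on the kinetic parameters mentioned in the abstract above: once rate constants are known at two temperatures, the activation energy follows from the Arrhenius equation, k = A·exp(−Ea/RT). The rate constants below are invented for illustration and are not values from the thesis.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def activation_energy(k1, T1, k2, T2):
    """Ea from the two-point Arrhenius form:
    ln(k2/k1) = -(Ea/R) * (1/T2 - 1/T1)."""
    return -R * math.log(k2 / k1) / (1.0 / T2 - 1.0 / T1)

# Illustrative rate constants at the two temperatures used in the study.
Ea = activation_energy(k1=0.010, T1=313.0, k2=0.060, T2=343.0)
print(f"Ea = {Ea / 1000:.1f} kJ/mol")
```

A six-fold increase in rate over a 30 K interval in this range corresponds to an activation energy of roughly 53 kJ/mol.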
Abstract:
This paper concerns the development of drives that use rotating electromechanical motor systems. An experimental drive test structure integrated with simulation software is proposed. The objective of this work is to show that an affordable model validation procedure can be obtained by combining precision data acquisition with well-tuned state-of-the-art simulation packages. This is required for fitting, in the best way, a drive to its load or, inversely, for adapting loads to given drive characteristics.
Abstract:
Wind energy has raised high expectations due to the risks of global warming and nuclear power plant accidents. Nowadays, wind farms are often constructed in areas of complex terrain. A potential wind farm site must be thoroughly surveyed and its wind climatology analyzed before installing any hardware. Therefore, modeling of Atmospheric Boundary Layer (ABL) flows over complex terrains containing, e.g., hills, forests, and lakes is of great interest in wind energy applications, as it can help in locating and optimizing wind farms. Numerical modeling of wind flows using Computational Fluid Dynamics (CFD) has become a popular technique during the last few decades. Due to the inherent flow variability and large-scale unsteadiness typical of ABL flows in general, and especially over complex terrains, the flow can be difficult to predict accurately enough using the Reynolds-Averaged Navier-Stokes (RANS) equations. Large-Eddy Simulation (LES) resolves the largest, and thus most important, turbulent eddies and models only the small-scale motions, which are more universal than the large eddies and thus easier to model. Therefore, LES is expected to be more suitable for this kind of simulation, although it is computationally more expensive than the RANS approach. With the fast development of computers and open-source CFD software in recent years, the application of LES to atmospheric flows is becoming increasingly common. The aim of this work is to simulate atmospheric flows over realistic and complex terrains by means of LES. Evaluation of potential inland wind park locations will be the main application of these simulations. The development of an LES methodology to simulate atmospheric flows over realistic terrains is reported in the thesis. The work also aims at validating the LES methodology at real scale.
In the thesis, LES is carried out for flow problems ranging from basic channel flows to real atmospheric flows over one of the most recent real-life complex terrain problems, the Bolund hill. All the simulations reported in the thesis are carried out using a new OpenFOAM®-based LES solver. The solver uses the 4th-order time-accurate Runge-Kutta scheme and a fractional step method. Moreover, the development of the LES methodology includes special attention to two boundary conditions: the upstream (inflow) and wall boundary conditions. The upstream boundary condition is generated using the so-called recycling technique, in which the instantaneous flow properties are sampled on a plane downstream of the inlet and mapped back to the inlet at each time step. This technique develops the upstream boundary-layer flow together with the inflow turbulence without using any precursor simulation, and thus within a single computational domain. The roughness of the terrain surface is modeled by implementing a new wall function into OpenFOAM® during the thesis work. Both the recycling method and the newly implemented wall function are validated for channel flows at relatively high Reynolds number before applying them to the atmospheric flow applications. After validating the LES model on simple flows, simulations are carried out for atmospheric boundary-layer flows over two types of hills: first, two-dimensional wind-tunnel hill profiles, and second, the Bolund hill located in Roskilde Fjord, Denmark. For the two-dimensional wind-tunnel hills, the study focuses on the overall flow behavior as a function of the hill slope. Moreover, the simulations are repeated using another wall function suitable for smooth surfaces, which already existed in OpenFOAM®, in order to study the sensitivity of the flow to the surface roughness in ABL flows. The simulated results obtained using the two wall functions are compared against the wind-tunnel measurements.
It is shown that LES using the implemented wall function produces overall satisfactory results for the turbulent flow over the two-dimensional hills. The prediction of the flow separation and reattachment length for the steeper hill is closer to the measurements than the other numerical studies reported in the past for the same hill geometry. The field measurement campaign performed over the Bolund hill provides the most recent field-experiment dataset for the mean flow and turbulence properties. A number of research groups have simulated the wind flows over the Bolund hill. Due to the challenging features of the hill, such as its almost vertical slope, it is considered an ideal experimental test case for validating micro-scale CFD models for wind energy applications. In this work, the simulated results obtained for two wind directions are compared against the field measurements. It is shown that the present LES can reproduce the complex turbulent wind flow structures over a complicated terrain such as the Bolund hill. In particular, the present LES results show the best prediction of the turbulent kinetic energy, with an average error of 24.1%, which is 43% smaller than any other model results reported in the past for the Bolund case. Finally, the validated LES methodology is demonstrated by simulating the wind flow over the existing Muukko wind farm located in south-eastern Finland. The simulation is carried out for only one wind direction, and the results for the instantaneous and time-averaged wind speeds are briefly reported. The demonstration case is followed by discussions of the practical aspects of LES for wind resource assessment over a realistic inland wind farm.
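The recycling inflow technique described in the abstract above can be illustrated with a deliberately trivial sketch: a 1D field is advected one cell per step, and the inlet cell is re-fed with the value sampled at a plane inside the domain. Only the sample-and-map-back bookkeeping mirrors the real method; the domain size, plane index, and "physics" are arbitrary toy choices.

```python
# Toy 1D recycling inflow: upwind advection at CFL = 1, with the inlet
# value taken each step from a recycling (sampling) plane downstream.
N = 16          # number of cells in the toy domain
J_R = 5         # index of the recycling plane
STEPS = 40

u = [0.0] * N
u[0] = 1.0      # seed a disturbance at the inlet
history = []    # record the inlet value to see the recycled signal

for step in range(STEPS):
    sampled = u[J_R]             # sample on the plane downstream of the inlet
    u = [u[0]] + u[:-1]          # advect the whole field one cell per step
    u[0] = sampled               # map the sampled value back to the inlet
    history.append(u[0])

print(history.count(1.0))        # the disturbance is re-injected periodically
```

The seeded disturbance reaches the recycling plane, is mapped back, and keeps re-entering the domain; in a real LES the mapped quantity is a rescaled instantaneous velocity field, which sustains turbulent inflow without a precursor simulation.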
Abstract:
A new method for sampling the exact (within the nodal error) ground state distribution and nondifferential properties of multielectron systems is developed and applied to first-row atoms. Calculated properties are the distribution moments and the electronic density at the nucleus (the δ operator). For this purpose, new simple trial functions are developed and optimized. First, using hydrogen as a test case, we demonstrate the accuracy of our algorithm and its sensitivity to error in the trial function. Applications to first-row atoms are then described. We obtain results which are more satisfactory than those obtained previously using Monte Carlo methods, despite the relative crudeness of our trial functions. Also, a comparison is made with results of highly accurate post-Hartree-Fock calculations, thereby illuminating the nodal error in our estimates. Taking into account the CPU time spent, our results, particularly for the δ operator, have a relatively large variance. Several ways of improving the efficiency, together with some extensions of the algorithm, are suggested.
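As a toy illustration of the Monte Carlo machinery described above (not the thesis algorithm): Metropolis sampling of the exact hydrogen ground-state density, |ψ|² ∝ e^(−2r) in atomic units, from which distribution moments such as ⟨r⟩ = 1.5 a₀ can be estimated.

```python
import math, random

random.seed(1)

def psi2(x, y, z):
    """Unnormalized hydrogen 1s density |psi|^2 = exp(-2 r), atomic units."""
    return math.exp(-2.0 * math.sqrt(x * x + y * y + z * z))

pos = (1.0, 0.0, 0.0)
rs = []
for step in range(200_000):
    trial = tuple(c + random.uniform(-0.5, 0.5) for c in pos)
    # Metropolis acceptance: move with probability min(1, p_trial / p_current)
    if random.random() < psi2(*trial) / psi2(*pos):
        pos = trial
    rs.append(math.sqrt(sum(c * c for c in pos)))

mean_r = sum(rs) / len(rs)
print(f"<r> estimate: {mean_r:.3f} a0 (exact: 1.5)")
```

For hydrogen the density is known exactly, so the sampled moment can be checked against the analytic value; for multielectron atoms the same sampling idea runs on optimized trial functions instead.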
Abstract:
One of the most challenging tasks for a synthetic organic chemist today is the development of chemo-, regio-, and stereoselective methodologies toward the total synthesis of macromolecules. The objective of my thesis was to develop methodologies towards this end. The first part of my project was to develop highly functionalized chirons from D-glucose, a cheap, chiral starting material, to be utilized in this capacity. The second part of the project dealt with modifying the carbon-carbon bond forming Suzuki reaction, which is often utilized as a means of combining molecular subunits in total synthesis applications. As previously stated, the first area of the project was to develop high-value chirons from D-glucose, but the mechanism of their formation was also investigated. The free-radical-initiated oxidative fragmentation of benzylidene acetals was investigated through the use of several test-case substrates in order to unravel the possible mechanistic pathways. This was performed by reacting the different acetals with N-bromosuccinimide and benzoyl peroxide in chlorobenzene at 70 °C in all cases. Of the three mechanistic pathways discussed in the literature, it was determined from the various reaction products obtained that the fragmentation of the initial benzylic radical does not occur spontaneously; rather, oxidation proceeds to give the benzyl bromide, which then fragments via a polar pathway. It was also discovered that the regioselectivity of the fragmentation step could be altered through incorporation of an allylic system into the benzylidene acetal. This allows for the acquisition of a new set of densely functionalized, chiral, valuable synthetic intermediates in only a few steps and in high yields from α-D-glucose.
The second part of the project was the utilization of the phosphonium salt room-temperature ionic liquid tetradecyltrihexylphosphonium chloride (THPC) as an efficient reusable medium for the palladium-catalyzed Suzuki cross-coupling reaction of aryl halides, including aryl chlorides, under mild conditions. The cross-coupling reactions were found to proceed in THPC containing small amounts of water and toluene, using potassium phosphate and 1% Pd2(dba)3. Variously substituted iodobenzenes, including electron-rich derivatives, reacted efficiently in THPC with a variety of arylboronic acids and afforded complete conversion within 1 hour at 50 °C. The corresponding aryl bromides also reacted under these conditions with the addition of a catalytic amount of triphenylphosphine, which allowed for complete conversion and high isolated yields. The reactions involving aryl chlorides were considerably slower, although the addition of triphenylphosphine and heating at 70 °C allowed high conversion of electron-deficient derivatives. Addition of water and hexane to the reaction products results in a triphasic system in which the top hexane phase contained the biaryl products, the palladium catalyst remained fully dissolved in the central THPC layer, and the inorganic salts were extracted into the lower aqueous phase. The catalyst was then recycled by removing the top and bottom layers and adding the reagents to the ionic liquid, which was heated again at 50 °C, resulting in complete turnover of iodobenzene. Repetition of this procedure gave the biphenyl product in 82-97% yield (repeated five times) for both the initial and recycled reaction sequences.
Abstract:
This thesis examines the impacts on the morphology of tributaries of the St. Lawrence River of the changes in their discharge and base level caused by the climate change projected for the period 2010–2099. The selected tributaries (the Batiscan, Richelieu, Saint-Maurice, Saint-François, and Yamachiche rivers) were chosen for their differences in size, discharge, and morphological context. Not only do these tributaries experience a modified hydrological regime due to climate change, but their base level (the water level of the St. Lawrence River) will also be affected. The one-dimensional (1D) morphodynamic model SEDROUT, originally developed for aggrading gravel-bed rivers, was adapted to the specific context of the St. Lawrence Lowlands tributaries in order to simulate sand-bed rivers with variable daily discharge and downstream water-level fluctuations. A module to simulate the partitioning of sediments around islands was also added to the model. The improved model (SEDROUT4-M), which was tested with small-scale simulations and with present-day flow and sediment-transport conditions in four tributaries of the St. Lawrence River, can now simulate a range of river morphodynamic problems. Changes in bed elevation and in sediment delivery to the St. Lawrence River for the period 2010–2099 were simulated with SEDROUT4-M for the Batiscan, Richelieu, and Saint-François rivers for all combinations of seven hydrological regimes (present conditions and those predicted by three global climate models (GCMs) and two greenhouse-gas scenarios) and three scenarios of base-level change of the St. Lawrence River (no change, gradual decline, abrupt decline).
The impacts on sediment delivery and bed elevation differ among the GCMs and appear to be related to the status of the rivers (whether they are aggrading, degrading, or in equilibrium), which illustrates the importance of examining several rivers with different climate models in order to establish trends in the effects of climate change. Although the mean daily discharge and the mean annual discharge remain close to their present values in the three GCM scenarios, substantial changes in the simulated sediment-transport rates are observed for each tributary. This is due to the strong impact of more frequent large floods in a future climate, as well as to the earlier arrival of the spring flood, which results in increased variability in bedload transport rates. Some complications with the 1D modeling approach in representing the complex geometry of the Saint-Maurice and Saint-François rivers suggest that a two-dimensional (2D) approach should be seriously considered in order to simulate more accurately the distribution of discharge at bifurcations around islands. The Saint-François River is used as a test case for the 2D model H2D2, which performs well hydraulically but requires adjustments in order to fully simulate the morphological adjustments of rivers.
Abstract:
The use of a data assimilation method, combined with an anelastic convection model, allows us to reconstruct the physical structures of part of the convection zone located beneath an active solar region. The results obtained inform us about the emergence processes of magnetic flux tubes through the convection zone as well as about the formation mechanisms of active regions. The solar data used come from the MDI instrument aboard the SOHO space observatory and concern mainly active region AR9077 during the "Bastille Day" event of July 14, 2000. This event led to a solar flare, followed by a large coronal mass ejection. The assimilated data (magnetograms, temperature maps, and vertical-velocity maps) cover an area of 175 megameters per side, acquired at the photospheric level. The data assimilation method employed is "back and forth nudging", a Newtonian relaxation method similar to the "quasi-linear inverse 3D" method. Its originality is that it does not require computing the adjoint equations of the physical model. The simplicity of the method is thus a considerable numerical advantage. Through a simple test, our study shows the applicability of this method to a convection model used within the anelastic approximation. We thus demonstrate the efficiency of this method and reveal its potential for the assimilation of solar data. In order to ensure the mathematical uniqueness of the obtained solution, we impose a regularization over the whole simulated domain. Finally, we show that the interest of the method is not limited to the reconstruction of convective structures: it also allows optimal interpolation of photospheric magnetograms, and even prediction of their temporal evolution.
Abstract:
Methods are developed for predicting the vibration response characteristics of systems which change configuration during operation. A Cartesian robot, an example of such a position-dependent system, served as a test case for these methods and was studied in detail. The chosen system model was formulated using the technique of Component Mode Synthesis (CMS). The model assumes that the system is slowly varying, and connects the carriages to each other and to the robot structure at the slowly varying connection points. The modal data required for each component are obtained experimentally in order to get a realistic model. The analysis results in the prediction of vibrations produced by the inertia forces as well as the gravity and friction forces which arise when the robot carriages move with some prescribed motion. Computer simulations and experimental determinations are conducted in order to calculate the vibrations at the robot end-effector. Comparisons are shown to validate the model in two ways: for a fixed configuration, the mode shapes and natural frequencies are examined, and then, for a changing configuration, the residual vibration at the end of the motion is evaluated. A preliminary study was done on a geometrically nonlinear system which also has position dependency. The system consisted of a flexible four-bar linkage with elastic input and output shafts. The behavior of the rocker-beam is analyzed for different boundary conditions to show how some limiting cases are obtained. A dimensional analysis leads to an evaluation of the consequences of dynamic similarity on the resulting vibration.
Abstract:
Russia underwent great changes after the disintegration of the USSR in 1991. However, with Vladimir Putin's rise to power, Russia's geostrategic interests in the post-Soviet space revived with new vigor, owing to the greater amount of resources at the State's disposal. The Republic of Moldova is a clear example of the resurgence of Russian foreign policy toward the post-Soviet space, being, moreover, a key region in the Russian Federation's struggle to recover its zone of influence.
Abstract:
The fact that the hybrid building is an extremely condensed urban block which increases the city's density and contributes to the public realm of the city, horizontally as well as vertically, forms one of the key interests of this documentation, research, and master studio work. The "ground scraper" is not only public because of the character of its plinth facing the surrounding streets, but also with regard to its interior space, which is partly accessible to the public. As such, the European hybrid building potentially extends the city's public domain horizontally and vertically into the building's interior and links the public domain inside and outside. Nevertheless, the hybrid building, due to its specific and unconventional character, represents a truly urban architecture that was unfortunately often rejected in the name of 'purity' of form and function during the twentieth century. In other words, its rejection demonstrates the domination of the building's plan over its section. Today, new frameworks for the city, like the "compact city", call for innovative interpretations and designs of building types, worthy of being investigated and proposed. The architectural type of the hybrid building (re)defines and expresses the relation between architecture and the city in a specific manner. To begin with, the city of Rotterdam forms the first test case of the Hybrid's project to document and discuss statements such as "the hybrid building has a long-standing tradition within this modern city", "it is a machine for urbanity", "it enlarges the city", "it innovates because of its ambitiousness but also because of necessity", "it combines to activate", and "it asks for extraordinary design intelligence and craftsmanship". A special way of drawing is developed to document, analyse, and compare historical and contemporary representatives of the species.
The method includes a panoply of scales, ranging from the morphological arrangement on the scale of the city and the typologies of stacking diverse programs to the architectural features that establish the mutual relationship between the public space of the city and the interior of the building. Essentially, the features analysed within the series of drawings are also constitutional for (the success of) every future hybrid building.
Abstract:
This paper reflects on the challenges facing the effective implementation of the new EU fundamental rights architecture that emerged from the Lisbon Treaty. Particular attention is paid to the role of the Court of Justice of the European Union (CJEU) and its ability to function as a ‘fundamental rights tribunal’. The paper first analyses the praxis of the European Court of Human Rights in Strasbourg and its long-standing experience in overseeing the practical implementation of the European Convention for the Protection of Human Rights and Fundamental Freedoms. Against this analysis, it then examines the readiness of the CJEU to live up to its consolidated and strengthened mandate on fundamental rights as one of the prime guarantors of the effective implementation of the EU Charter of Fundamental Rights. We specifically review the role of ‘third-party interventions’ by non-governmental organisations, international and regional human rights actors as well as ‘interim relief measures’ when ensuring effective judicial protection of vulnerable individuals in cases of alleged violations of fundamental human rights. To flesh out our arguments, we rely on examples within the scope of the relatively new and complex domain of EU legislation, the Area of Freedom, Security and Justice (AFSJ), and its immigration, external border and asylum policies. In view of the fundamental rights-sensitive nature of these domains, which often encounter shifts of accountability and responsibility in their practical application, and the Lisbon Treaty’s expansion of the jurisdiction of the CJEU to interpret and review EU AFSJ legislation, this area can be seen as an excellent test case for the analyses at hand. The final section puts forth a set of policy suggestions that can assist the CJEU in the process of adjusting itself to the new fundamental rights context in a post-Lisbon Treaty setting.
Abstract:
There is a pressing need for good rainfall data for the African continent, both for humanitarian and for climatological purposes. Given the sparseness of ground-based observations, one source of rainfall information is Numerical Weather Prediction (NWP) model outputs. The aim of this article is to investigate the quality of two NWP products using Ethiopia as a test case. The two products evaluated are the ERA-40 and NCEP reanalysis rainfall products. Spatial, seasonal and interannual variability of rainfall have been evaluated for the Kiremt (JJAS) and Belg (FMAM) seasons at a spatial scale that reflects the local variability of the rainfall climate, using a method which makes optimum use of sparse gauge validation data. We found that the spatial pattern of the rainfall climatology is captured well by both models, especially for the main rainy season, Kiremt. However, both models tend to overestimate the mean rainfall in the northwest, west and central regions but underestimate it in the south and east. The overestimation is greater for NCEP in the Belg season and greater for ERA-40 in the Kiremt season. ERA-40 captures the annual cycle over most of the country better than NCEP, but strongly exaggerates the Kiremt peak in the northwest and west. The overestimation in Kiremt appears to have been reduced since the assimilation of satellite data increased around 1990. For both models, the interannual variability is less well captured than the spatial and seasonal variability. Copyright © 2008 Royal Meteorological Society
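The model-versus-gauge evaluation described above reduces, at its simplest, to skill statistics such as mean bias and interannual correlation; the seasonal rainfall totals below are invented for illustration and are not ERA-40 or NCEP values.

```python
import math

# Hypothetical seasonal rainfall totals (mm): gauge observations vs an
# NWP reanalysis, one value per year.
gauge = [310.0, 280.0, 450.0, 390.0, 150.0, 520.0]
model = [360.0, 300.0, 430.0, 470.0, 190.0, 600.0]

n = len(gauge)
bias = sum(m - g for m, g in zip(model, gauge)) / n      # mean error (mm)
mg = sum(gauge) / n
mm = sum(model) / n
cov = sum((g - mg) * (m - mm) for g, m in zip(gauge, model)) / n
sg = math.sqrt(sum((g - mg) ** 2 for g in gauge) / n)
sm = math.sqrt(sum((m - mm) ** 2 for m in model) / n)
corr = cov / (sg * sm)                                   # interannual correlation

print(f"bias = {bias:+.1f} mm, r = {corr:.2f}")
```

A positive bias with high correlation, as in this made-up series, is exactly the pattern the article reports for the wet regions: the models track year-to-year variations but systematically overestimate the totals.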