870 results for Integrated circuits Ultra large scale integration
Abstract:
The predictability of high impact weather events on multiple time scales is a crucial issue both in scientific and socio-economic terms. In this study, a statistical-dynamical downscaling (SDD) approach is applied to an ensemble of decadal hindcasts obtained with the Max-Planck-Institute Earth System Model (MPI-ESM) to estimate the decadal predictability of peak wind speeds (as a proxy for gusts) over Europe. Yearly initialized decadal ensemble simulations with ten members are investigated for the period 1979–2005. The SDD approach is trained with COSMO-CLM regional climate model simulations and ERA-Interim reanalysis data and applied to the MPI-ESM hindcasts. The simulations for the period 1990–1993, which was characterized by several windstorm clusters, are analyzed in detail. The anomalies of the 95 % peak wind quantile of the MPI-ESM hindcasts are in line with the positive anomalies in reanalysis data for this period. To evaluate both the skill of the decadal predictability system and the added value of the downscaling approach, quantile verification skill scores are calculated for both the MPI-ESM large-scale wind speeds and the SDD simulated regional peak winds. Skill scores are predominantly positive for the decadal predictability system, with the highest values for short lead times and for (peak) wind speeds equal to or above the 75 % quantile. This provides evidence that the analyzed hindcasts and the downscaling technique are suitable for estimating wind and peak wind speeds over Central Europe on decadal time scales. The skill scores for SDD simulated peak winds are slightly lower than those for large-scale wind speeds. This behavior can be largely attributed to the fact that peak winds are a proxy for gusts, and thus have a higher variability than wind speeds. The introduced cost-efficient downscaling technique has the advantage of providing not only wind speeds but also peak winds (a proxy for gusts) and can be easily applied to large ensemble datasets like operational decadal prediction systems.
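The quantile verification skill scores mentioned above have a standard construction; the sketch below is a minimal illustration of that idea, assuming the usual pinball (quantile) loss and a climatological quantile as the reference forecast. The function names and numbers are illustrative and not taken from the study.

```python
import numpy as np

def quantile_loss(obs, forecast, tau):
    """Pinball loss for quantile level tau (0 < tau < 1)."""
    diff = obs - forecast
    return np.mean(np.where(diff >= 0, tau * diff, (tau - 1.0) * diff))

def quantile_skill_score(obs, forecast, reference, tau):
    """Skill score relative to a reference forecast (1 = perfect, 0 = no skill over reference)."""
    return 1.0 - quantile_loss(obs, forecast, tau) / quantile_loss(obs, reference, tau)

# Illustrative use: verify a 75th-percentile peak-wind forecast against observations,
# using the observed climatological quantile as the reference.
obs = np.array([18.2, 22.5, 30.1, 25.4, 27.8])    # observed peak winds (m/s), made up
fc = np.array([19.0, 21.0, 28.5, 24.0, 26.5])     # forecast 75 % quantile, made up
ref = np.full_like(obs, np.quantile(obs, 0.75))   # climatological reference
print(quantile_skill_score(obs, fc, ref, tau=0.75))
```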
Abstract:
A generalization of Arakawa and Schubert's convective quasi-equilibrium principle is presented for a closure formulation of mass-flux convection parameterization. The original principle is based on the budget of the cloud work function. This principle is generalized by considering the budget for a vertical integral of an arbitrary convection-related quantity. The closure formulation includes Arakawa and Schubert's quasi-equilibrium, as well as both CAPE and moisture closures, as special cases. The formulation also includes new possibilities for considering vertical integrals that are dependent on convective-scale variables, such as the moisture within convection. The generalized convective quasi-equilibrium is defined by a balance between large-scale forcing and convective response for a given vertically-integrated quantity. The latter takes the form of a convolution of a kernel matrix and a mass-flux spectrum, as in the original convective quasi-equilibrium. The kernel reduces to a scalar when either a bulk formulation is adopted, or only large-scale variables are considered within the vertical integral. Various physical implications of the generalized closure are discussed. These include the possibility that precipitation might be considered as a potentially-significant contribution to the large-scale forcing. Two dicta are proposed as guiding physical principles for specifying a suitable vertically-integrated quantity.
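For orientation, the original Arakawa-Schubert quasi-equilibrium that this abstract generalizes can be written schematically as below (standard textbook notation, not quoted from the paper); the generalization replaces the cloud work function A_i by the vertical integral of an arbitrary convection-related quantity.

```latex
% Cloud work function A_i of convection type i: generated by the large-scale
% forcing F_i and consumed by convection through a kernel K_{ij} acting on the
% cloud-base mass-flux spectrum M_j.
\frac{dA_i}{dt} \;=\; F_i \;+\; \sum_j K_{ij}\, M_j \;\approx\; 0
\qquad\Longrightarrow\qquad
\sum_j K_{ij}\, M_j \;\approx\; -F_i .
```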
Abstract:
State-of-the-art regional climate model simulations that are able to resolve key mesoscale circulations are used, for the first time, to understand the interaction between the large-scale convective environment of the MJO and processes governing the strong diurnal cycle over the islands of the Maritime Continent (MC). Convection is sustained in the late afternoon just inland of the coasts due to sea breeze convergence. Previous work has shown that the variability in MC rainfall associated with the MJO is manifested in changes to this diurnal cycle; land-based rainfall peaks before the active convective envelope of the MJO reaches the MC, whereas oceanic rainfall rates peak whilst the active envelope resides over the region. The model simulations show that the main controls on oceanic MC rainfall in the early active MJO phases are the large-scale environment and atmospheric stability, followed by high oceanic latent heat flux forced by high near-surface winds in the later active MJO phases. Over land, rainfall peaks before the main convective envelope arrives (in agreement with observations), even though the large-scale convective environment is only moderately favourable for convection. The causes of this early rainfall peak are convective triggers from land-sea breeze circulations that are strong due to high surface insolation and surface heating. During the peak MJO phases cloud cover increases and surface insolation decreases, which weakens the strength of the mesoscale circulations and reduces land-based rainfall, even though the large-scale environment remains favourable for convection at this time. Hence, scale interactions are an essential part of the MJO transition across the MC.
Abstract:
Floods are the most frequent of natural disasters, affecting millions of people across the globe every year. The anticipation and forecasting of floods at the global scale is crucial to preparing for severe events and providing early awareness where local flood models and warning services may not exist. As numerical weather prediction models continue to improve, operational centres are increasingly using the meteorological output from these to drive hydrological models, creating hydrometeorological systems capable of forecasting river flow and flood events at much longer lead times than has previously been possible. Furthermore, developments in, for example, modelling capabilities, data and resources in recent years have made it possible to produce global scale flood forecasting systems. In this paper, the current state of operational large scale flood forecasting is discussed, including probabilistic forecasting of floods using ensemble prediction systems. Six state-of-the-art operational large scale flood forecasting systems are reviewed, describing similarities and differences in their approaches to forecasting floods at the global and continental scale. Currently, operational systems have the capability to produce coarse-scale discharge forecasts in the medium-range and disseminate forecasts and, in some cases, early warning products, in real time across the globe, in support of national forecasting capabilities. With improvements in seasonal weather forecasting, future advances may include more seamless hydrological forecasting at the global scale, alongside a move towards multi-model forecasts and grand ensemble techniques, responding to the requirement of developing multi-hazard early warning systems for disaster risk reduction.
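As a minimal illustration of the probabilistic, ensemble-based flood forecasting discussed above, the sketch below estimates threshold-exceedance probabilities from an ensemble of discharge forecasts; the numbers and the 1500 m^3/s threshold are hypothetical and not taken from any of the reviewed systems.

```python
import numpy as np

def exceedance_probability(ensemble_discharge, threshold):
    """Fraction of ensemble members exceeding a flood threshold.

    ensemble_discharge: array of shape (n_members, n_lead_times), in m^3/s.
    Returns an array of length n_lead_times with values in [0, 1].
    """
    return np.mean(ensemble_discharge > threshold, axis=0)

# Illustrative use: 5 members, 3 lead times, a hypothetical 1500 m^3/s threshold.
members = np.array([
    [1200.0, 1600.0, 1900.0],
    [1100.0, 1450.0, 1700.0],
    [1300.0, 1550.0, 1650.0],
    [1000.0, 1400.0, 1500.0],
    [1250.0, 1700.0, 2000.0],
])
print(exceedance_probability(members, threshold=1500.0))  # prints [0.  0.6 0.8]
```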
Abstract:
Seasonal forecast skill of the basinwide and regional tropical cyclone (TC) activity in an experimental coupled prediction system based on the ECMWF System 4 is assessed. As part of a collaboration between the Center for Ocean–Land–Atmosphere Studies (COLA) and the ECMWF called Project Minerva, the system is integrated at the atmospheric horizontal spectral resolutions of T319, T639, and T1279. Seven-month hindcasts starting from 1 May for the years 1980–2011 are produced at all three resolutions with at least 15 ensemble members. The Minerva system demonstrates statistically significant skill for retrospective forecasts of TC frequency and accumulated cyclone energy (ACE) in the North Atlantic (NA), eastern North Pacific (EP), and western North Pacific. While the highest scores overall are achieved in the North Pacific, the skill in the NA appears to be limited by an overly strong influence of the tropical Pacific variability. Higher model resolution improves skill scores for the ACE and, to a lesser extent, the TC frequency, even though the influence of large-scale climate variations on these TC activity measures is largely independent of resolution changes. The biggest gain occurs in transition from T319 to T639. Significant skill in regional TC forecasts is achieved over broad areas of the Northern Hemisphere. The highest-resolution hindcasts exhibit additional locations with skill in the NA and EP, including land-adjacent areas. The feasibility of regional intensity forecasts is assessed. In the presence of the coupled model biases, the benefits of high resolution for seasonal TC forecasting may be underestimated.
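Accumulated cyclone energy (ACE), one of the quantities verified above, has a simple conventional definition: the sum of squared 6-hourly maximum sustained winds (in knots) for records at or above tropical-storm strength, scaled by 10^-4. A minimal sketch assuming that convention (the numbers are illustrative):

```python
def accumulated_cyclone_energy(vmax_kt, threshold_kt=34.0):
    """ACE from 6-hourly maximum sustained winds in knots.

    Sums the squared winds for records at or above tropical-storm strength
    and scales by 1e-4, following the conventional definition.
    """
    return 1e-4 * sum(v ** 2 for v in vmax_kt if v >= threshold_kt)

# Illustrative use: one storm's 6-hourly intensity record (knots).
print(accumulated_cyclone_energy([30, 35, 45, 60, 70, 65, 50, 30]))  # about 1.85
```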
Abstract:
Projected impacts of climate change on the populations and distributions of species pose a challenge for conservationists. In response, a number of adaptation strategies to enable species to persist in a changing climate have been proposed. Management to maximise the quality of habitat at existing sites may reduce the magnitude or frequency of climate-driven population declines. In addition, large-scale management of landscapes could potentially improve the resilience of populations by facilitating inter-population movements. A reduction in the obstacles to species’ range expansion may also allow species to track changing conditions better through shifts to new locations, either regionally or locally. However, despite a strong theoretical base, there is limited empirical evidence to support these management interventions. This makes it difficult for conservationists to decide on the most appropriate strategy for different circumstances. Here, extensive data from long-term monitoring of woodland birds at individual sites are used to examine the two-way interactions between habitat and both weather and population count in the previous year. This tests the extent to which site-scale and landscape-scale habitat attributes may buffer populations against variation in winter weather (a key driver of woodland bird population size) and facilitate subsequent population growth. Our results provide some support for the prediction that landscape-scale attributes (patch isolation and area of woodland habitat) may influence the ability of some woodland bird species to withstand weather-mediated population declines. These effects were most apparent among generalist woodland species. There was also evidence that several, primarily specialist, woodland species are more likely to increase following population decline where there is more woodland at both site and landscape scales. These results provide empirical support for the concept that landscape-scale conservation efforts may make the populations of some woodland bird species more resilient to climate change. However, in isolation, management is unlikely to provide a universal benefit to all species.
Abstract:
Decadal predictions on timescales from one year to one decade are gaining importance since this time frame falls within the planning horizon of politics, economy and society. The present study examines the decadal predictability of regional wind speed and wind energy potentials in three generations of the MiKlip (‘Mittelfristige Klimaprognosen’) decadal prediction system. The system is based on the global Max-Planck-Institute Earth System Model (MPI-ESM), and the three generations differ primarily in the ocean initialisation. Ensembles of uninitialised historical and yearly initialised hindcast experiments are used to assess the forecast skill for 10 m wind speeds and wind energy output (Eout) over Central Europe with lead times from one year to one decade. With this aim, a statistical-dynamical downscaling (SDD) approach is used for the regionalisation. Its added value is evaluated by comparison of skill scores for MPI-ESM large-scale wind speeds and SDD-simulated regional wind speeds. All three MPI-ESM ensemble generations show some forecast skill for annual mean wind speed and Eout over Central Europe on yearly and multi-yearly time scales. This forecast skill is mostly limited to the first years after initialisation. Differences between the three ensemble generations are generally small. The regionalisation preserves and sometimes increases the forecast skill of the global runs, but results depend on lead time and ensemble generation. Moreover, regionalisation often improves the ensemble spread. Seasonal Eout skills are generally lower than for annual means. Skill scores are lowest during summer and persist longest in autumn. A large-scale westerly weather type with strong pressure gradients over Central Europe is identified as a potential source of the skill for wind energy potentials, showing a similar forecast skill and a high correlation with Eout anomalies. These results are a promising step towards the establishment of a decadal prediction system for wind energy applications over Central Europe.
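Wind energy output (Eout) is typically derived from wind speed through a turbine power curve. The sketch below uses a generic idealized power curve (the cut-in, rated and cut-out speeds are illustrative parameters), not the specific transformation used in the MiKlip-based study.

```python
import numpy as np

def wind_power_output(v, v_cutin=3.5, v_rated=13.0, v_cutout=25.0, p_rated=1.0):
    """Idealized turbine power curve (output per unit of rated power).

    Zero below cut-in and above cut-out, cubic ramp between cut-in and rated
    speed, constant rated output between rated and cut-out speed.
    """
    v = np.asarray(v, dtype=float)
    ramp = (v ** 3 - v_cutin ** 3) / (v_rated ** 3 - v_cutin ** 3)
    power = np.where((v >= v_cutin) & (v < v_rated), p_rated * ramp, 0.0)
    power = np.where((v >= v_rated) & (v <= v_cutout), p_rated, power)
    return power

# Illustrative use: convert a short series of 10 m wind speeds (m/s) to Eout.
speeds = np.array([2.0, 5.0, 9.0, 14.0, 26.0])
print(wind_power_output(speeds))  # approximately [0, 0.038, 0.319, 1.0, 0]
```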
Abstract:
The gene SNRNP200 is composed of 45 exons and encodes a protein essential for pre-mRNA splicing, the 200 kDa helicase hBrr2. Two mutations in SNRNP200 have recently been associated with autosomal dominant retinitis pigmentosa (adRP), a retinal degenerative disease, in two families from China. In this work we analyzed the entire 35-kb SNRNP200 genomic region in a cohort of 96 unrelated North American patients with adRP. To complete this large-scale sequencing project, we performed ultra high-throughput sequencing of pooled, untagged PCR products. We then validated the detected DNA changes by Sanger sequencing of individual samples from this cohort and from an additional cohort of 95 patients. One of the two previously known mutations (p.S1087L) was identified in 3 patients, while 4 new missense changes (p.R681C, p.R681H, p.V683L, p.Y689C) affecting highly conserved codons were identified in 6 unrelated individuals, indicating that the prevalence of SNRNP200-associated adRP is relatively high. We also took advantage of this research to evaluate the pool-and-sequence method, especially with respect to the generation of false positive and negative results. We conclude that, although this strategy can be adopted for rapid discovery of new disease-associated variants, it still requires extensive validation to be used in routine DNA screenings. (C) 2011 Wiley-Liss, Inc.
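To make the pooling arithmetic concrete (a back-of-the-envelope figure, not a number reported by the study): in an untagged pool of N diploid samples, a variant heterozygous in a single patient is expected in only about 1/(2N) of the reads, which is why distinguishing true rare variants from sequencing noise, and hence false positives and negatives, is the central validation issue.

```latex
% Expected read fraction of a variant heterozygous in one of N pooled diploid samples
f \;=\; \frac{1}{2N}, \qquad N = 96 \;\Rightarrow\; f \;=\; \frac{1}{192} \;\approx\; 0.52\% .
```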
Abstract:
We present the results of searches for dipolar-type anisotropies in different energy ranges above 2.5 × 10^17 eV with the surface detector array of the Pierre Auger Observatory, reporting on both the phase and the amplitude measurements of the first harmonic modulation in the right-ascension distribution. Upper limits on the amplitudes are obtained, which provide the most stringent bounds at present, being below 2% at 99% C.L. for EeV energies. We also compare our results to those of previous experiments and to some theoretical expectations. (C) 2011 Elsevier B.V. All rights reserved.
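For context, the amplitude and phase of a first-harmonic modulation in right ascension are conventionally obtained from the classical Rayleigh analysis below (unweighted form; the exposure weighting applied in the actual Auger analysis is omitted here). The alpha_i are the event right ascensions and P(>= r) is the chance probability that an isotropic distribution of N events yields an amplitude at least as large.

```latex
a = \frac{2}{N}\sum_{i=1}^{N}\cos\alpha_i , \qquad
b = \frac{2}{N}\sum_{i=1}^{N}\sin\alpha_i , \qquad
r = \sqrt{a^{2}+b^{2}} , \qquad
\varphi = \arctan\!\left(\frac{b}{a}\right) , \qquad
P(\geq r) = \exp\!\left(-\frac{N r^{2}}{4}\right) .
```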
Abstract:
This study aims to contextualize, within the global scenario, the role played by four large emerging countries: Brazil, Russia, India and China. Although there is much to discuss about the linkage between these economies, and about how much of it is partnership and how much is competition, the fact is that they have recently come to be treated as participants in the same phenomenon; they constitute the epicentre of a change that may alter the global balance of power in a not-too-distant future. The concept of the "competitiveness of nations" guides the study. The analysis of each country's performance rests on a definition of prosperity as a large-scale, multifactorial integration involving the economy, politics, society, the environment, public institutions, the performance of the business class, the innovation capacity of the scientific community, and culture. Specifically with regard to the Brazilian economic structure, the study revealed serious asymmetries that limit the country's growth prospects. The socio-economic and political fundamentals show important gaps, leaving sectors such as professional qualification, public management and the regulation of productive activities uncovered. Nevertheless, Brazil has been able to reach high levels of development in several areas, driven by private agents who manage to cope with a hazardous business environment. This formalizes a dichotomy between the timid or ineffective role of public entities, which fail to improve institutional management, and a remarkable entrepreneurial capacity to develop sophisticated, innovative processes and generate positive results. Repairing this imbalance is urgent, with a view to the equitable participation of all sectors of society in developing the country's competitive potential.
Abstract:
This work aims to reveal the interrelationships among the main determinants of innovation capability. To date, there are no empirical studies presenting a comprehensive model, based on a large-scale dataset, that shows the main routes to developing innovation capability. Starting from a comprehensive and systematic literature review, a theoretical model was built integrating the main factors suggested in theoretical and review articles. A reliable measurement scale was developed and the model was tested empirically on a sample of 243 Brazilian firms from several industries. The methods used include confirmatory factor analysis, structural equation modelling and multi-group analysis of structural invariance. The results show the direct effect of customer and market knowledge, as well as of strategic technology management, on innovation performance. Both factors are affected by the strategic intention to innovate and by transformational leadership, through people management for innovation. The organic character of the organizational structure and project management have a positive moderating effect on the relationship between strategic technology management and innovation performance. This moderating effect, however, did not appear in the relationship between customer and market knowledge and innovation performance, which was explained post hoc. The value of this study lies in presenting a reliable measurement scale and an empirically validated theoretical model that integrates several streams of research and makes explicit how leadership can structure and leverage managerial resources to create and sustain an organization's capability to innovate. This model constitutes a road map that can help managers develop strategies and practices capable of leveraging innovation performance.
Abstract:
Tests on printed circuit boards and integrated circuits are widely used in industry, reducing the design time and cost of a project. The functional and connectivity tests for this type of circuit soon became a concern for manufacturers, prompting research into a reliable, quick, cheap and universal solution. Initially, test schemes were based on a set of needles connected to the inputs and outputs of the circuit board (bed-of-nails), to which signals were applied in order to verify whether the circuit met the specifications and could be assembled in the production line. With the development of projects, circuit miniaturization, improvements in production processes and materials, as well as the increase in the number of circuits, another solution had to be found. Thus Boundary-Scan Testing was developed, which operates at the periphery of integrated circuits and allows the connectivity of a circuit's input and output ports to be tested. The Boundary-Scan Testing method was turned into a standard in 1990 by the IEEE, becoming known as the IEEE 1149.1 Standard. Since then a large number of manufacturers have adopted this standard in their products. The main objective of this master's thesis is the design of Boundary-Scan Testing for an image sensor in CMOS technology: analyzing the requirements of the standard and the process used for prototype production, developing the design and layout of the Boundary-Scan logic, and analyzing the results obtained after production. Chapter 1 briefly presents the evolution of testing procedures used in industry, developments and applications of image sensors, and the motivation for using the Boundary-Scan Testing architecture. Chapter 2 covers the fundamentals of Boundary-Scan Testing and image sensors, starting with the Boundary-Scan architecture defined in the Standard, whose functional blocks are analyzed; this understanding is necessary to implement the design on an image sensor. It also explains the architecture of image sensors currently in use, focusing on sensors with a large number of inputs and outputs. Chapter 3 describes the Boundary-Scan design that was implemented, analyzing the design and functions of the prototype, the software used, and the designs and simulations of the implemented Boundary-Scan functional blocks. Chapter 4 presents the layout process based on the design developed in Chapter 3, describing the software used for this purpose, the planning of the layout location (floorplan) and its dimensions, the layout of the individual blocks, the design-rule checks, the comparison with the final design and, finally, the simulation. Chapter 5 describes how the functional tests were performed to verify the design's compliance with the specifications of the IEEE 1149.1 Standard; these tests focused on applying signals to the input and output ports of the produced prototype. Chapter 6 presents the conclusions drawn throughout the work.
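As a purely illustrative aid (a simplified model, not the thesis' CMOS implementation), the sketch below mimics the Shift-DR behaviour of an IEEE 1149.1 boundary-scan register: captured pin values are shifted out on TDO while a new test pattern is shifted in from TDI. The TAP state machine, clock edges and the update stage are deliberately ignored.

```python
def shift_boundary_register(chain, tdi_bits):
    """Simplified Shift-DR behaviour of a boundary-scan register chain.

    chain:    current cell contents, index 0 nearest TDI, last index nearest TDO.
    tdi_bits: bits presented serially on TDI, one per TCK cycle.
    Returns (new_chain, tdo_bits), where tdo_bits are the bits observed on TDO.
    """
    chain = list(chain)
    tdo_bits = []
    for bit in tdi_bits:
        tdo_bits.append(chain[-1])   # cell nearest TDO drives the output
        chain = [bit] + chain[:-1]   # every cell shifts one position towards TDO
    return chain, tdo_bits

# Illustrative use: a 4-cell register holding captured pin values 1,0,1,1;
# four shift cycles read those values out on TDO (cell nearest TDO first)
# while loading the test pattern presented on TDI.
new_chain, tdo = shift_boundary_register([1, 0, 1, 1], [0, 1, 0, 0])
print(tdo)        # [1, 1, 0, 1]  -> the captured values
print(new_chain)  # [0, 0, 1, 0]  -> the shifted-in pattern
```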
Abstract:
The present study aimed to understand how and to what extent electronic forró, currently hegemonic in the music market of the state of Rio Grande do Norte, establishes and maintains relations of domination in the social contexts in which it is produced, transmitted and received. Grounded, in both form and content, in the writings of the first generation of theorists of the so-called Frankfurt School (Critical Theory), particularly Theodor W. Adorno, and drawing systematically on the contributions of Cultural Studies (from the Centre for Contemporary Cultural Studies in Birmingham) and of the sociology of Pierre Bourdieu, the study sought to develop, at the fertile intersection of these references, a critical interpretation of the electronic forró predominantly disseminated in the state of Rio Grande do Norte. To this end, and aiming at a better apprehension of the so-called capital circuits/culture circuits, the study was based on a qualitative research design, drawing on structured interviews with musicians, entrepreneurs of the sector and music consumers, as well as on an analysis of the themes contained in the official discography of the electronic forró band Garota Safada (Shameless Girl). As a general empirical conclusion, it was possible to infer that, far from a uniform presence of domination or a mere prevalence of opposition, there is a relational pluralism of forms of domination and forms of resistance in the production and consumption of electronic forró, regardless of gender, age, income, education or place of residence. Nevertheless, the devices of the cultural industry have proven efficient: from large-scale businessmen to the small producers enabled by so-called open markets. The currentness of the concept of the cultural industry rests on the idea that its products are offered systematically (the systematic insistence of everything to everyone) and on the notion that its production primarily meets administrative criteria of control over the effects on the receiver (the capacity to prescribe desires). Thus, the Adornian reflection on pseudo-individualization leads to the inference that, even in some of the most visible forms of negotiation and/or refusal regarding the consumption of forró, certain behaviours of the cultural industry still prevail, both in the very (re)interpretation of forró and in the choice of other music genres that are likewise standardized, rationalized and massified. Therefore, despite the absence of a cause-effect relation and the recognition of the popular capacity to re-elaborate and contest media consumption, some prevailing world views related to electronic forró establish, or at least support, certain hegemonic ideologies, especially those concerning lifestyle, consumption and gender relations (fun by all means). Hence, through the massification of certain songs, some ways of disseminating values, beliefs and feelings are potentially experienced through electronic forró. It is therefore presumable that, in the current advance of the process of semiformation (Halbbildung), the habitus of part of the youth of the state of Rio Grande do Norte reinforces and is reinforced by the centrality of the trinomial fun, love and sex present in the songs, emphasized in certain constructive practices of sense and in certain flows of social signification.
Abstract:
In the current socio-environmental scenario, concern for natural resources has grown, along with interest in the reuse of products and by-products. Recycling is the approach by which a material or energy is reintroduced into the productive system. It reduces the volume of garbage dumped into the environment, saves energy and decreases the demand for natural resources. In general, end-of-life expanded polystyrene is deposited in sanitary landfills or uncontrolled garbage dumps, where it takes up a large volume and spreads easily by aeolian action, with consequent environmental pollution; recycling avoids this misuse and reduces the amount obtained from petroleum. In this work, expanded polystyrene was recycled via melting and/or dissolution in solvents for the production of integrated circuit boards. The obtained material was characterized in flexural mode according to ASTM D 790 and the results were compared with phenolite, the material traditionally used. Specimen fractures were observed by scanning electron microscopy in order to establish patterns. The recycled expanded polystyrene, as well as the phenolite, was also thermally analyzed by TGA and DSC. The dissolution method produced very brittle materials. The melting method showed neither void formation nor increased brittleness of the material. The recycled polystyrene presented a strength value significantly lower than that of the phenolite. (C) 2011 Published by Elsevier Ltd. Selection and peer-review under responsibility of ICM11
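For reference, the quantities reported from a three-point-bending test under ASTM D 790 follow the standard relations below, where F is the applied load, L the support span, b the specimen width, d the specimen thickness and m the initial slope of the load-deflection curve (textbook formulas, not data from the study).

```latex
\sigma_f = \frac{3 F L}{2 b d^{2}} , \qquad
E_f = \frac{L^{3} m}{4 b d^{3}} .
```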
Abstract:
The Software Product Line (SPL) approach has become very promising, since it allows the large-scale production of customized systems through product families. The Features Model is widely used for modelling these families; however, it has a low level of detail and may not be sufficient to guide an SPL development team. It is therefore recommended to complement the Features Model with other models that represent the system from other perspectives. The PL-AOVgraph goals model can play this complementary role, since it provides a language oriented to the SPL context, which allows requirements to be modelled in detail and crosscutting concerns that may arise as a result of variability to be identified. In order to insert PL-AOVgraph into SPL development, this work proposes a bi-directional mapping between PL-AOVgraph and the Features Model, automated by the ReqSys-MDD tool. This tool uses the Model-Driven Development (MDD) approach, which allows systems to be built from high-level models through successive transformations. This enables the integration of ReqSys-MDD with other MDD tools that use its output models as input to further transformations, making it possible to keep the models involved consistent and to avoid loss of information in the transitions between development stages.
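As a purely illustrative aid (not the PL-AOVgraph notation, the mapping rules, or the ReqSys-MDD tool itself), the sketch below shows the kind of feature-model structure that such a model-to-model transformation has to traverse: a tree of mandatory, optional and alternative features, walked depth-first. The product line and feature names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Feature:
    """A node in a feature model: name, variability kind and child features."""
    name: str
    kind: str = "mandatory"            # "mandatory" | "optional" | "alternative"
    children: List["Feature"] = field(default_factory=list)

def walk(feature, depth=0):
    """Depth-first traversal, the basic step of any model-to-model mapping."""
    print("  " * depth + f"{feature.name} ({feature.kind})")
    for child in feature.children:
        walk(child, depth + 1)

# Hypothetical product line used only for illustration.
root = Feature("MediaPlayer", children=[
    Feature("Playback"),
    Feature("Equalizer", kind="optional"),
    Feature("Codec", children=[
        Feature("MP3", kind="alternative"),
        Feature("OGG", kind="alternative"),
    ]),
])
walk(root)
```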