13 results for DEFECT CENTRES

at Universidad Politécnica de Madrid


Relevance: 20.00%

Abstract:

The detailed study of the deterioration suffered by the materials of the components of a nuclear facility, in particular those forming part of the reactor core, is a topic of great interest whose importance derives from its large technological and economic implications. Since changes in the atomic-structural properties of relevant components pose a risk to smooth operation, with clear consequences for the safety and lifetime of the plant, controlling these factors is essential in any engineering design and implementation. In recent times, tungsten has been proposed as a structural material on the basis of its good resistance to radiation, but an extensive study of the influence of temperature on the behavior of this material under radiation damage is still needed. This work aims to contribute in this regard. Molecular Dynamics (MD) simulations were carried out to determine the influence of temperature fluctuations on radiation damage production and evolution in tungsten. We have focused our study in particular on the dynamics of defect creation, recombination, and diffusion. PKA energies were sampled in a range from 5 to 50 keV. Three different temperature scenarios were analyzed, from very low temperatures (0-200 K) up to high-temperature conditions (300-500 K). We studied the creation of defects (vacancies and interstitials), recombination rates, diffusion properties, and cluster formation, size, and evolution. Simulations were performed using LAMMPS and the Zhou EAM potential for W.
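The vacancy and interstitial counting mentioned above is commonly done in MD post-processing by assigning displaced atoms to reference lattice sites (a Wigner-Seitz-style analysis). The following is a minimal illustrative sketch, not the authors' actual analysis pipeline; the toy simple-cubic lattice stands in for the bcc tungsten cell.

```python
# Hypothetical sketch of Wigner-Seitz-style defect counting, a common
# post-processing step for MD cascade simulations (illustrative only;
# the paper's own analysis tooling is not described in the abstract).
import itertools
import numpy as np

def count_defects(reference_sites, atom_positions):
    """Assign each atom to its nearest reference lattice site.

    A site with no assigned atom is a vacancy; each extra atom in a
    multiply occupied site counts as an interstitial.
    """
    occupancy = np.zeros(len(reference_sites), dtype=int)
    for pos in atom_positions:
        distances = np.linalg.norm(reference_sites - pos, axis=1)
        occupancy[np.argmin(distances)] += 1
    vacancies = int(np.sum(occupancy == 0))
    interstitials = int(np.sum(np.maximum(occupancy - 1, 0)))
    return vacancies, interstitials

# Small simple-cubic lattice as a stand-in for the bcc W cell.
sites = np.array(list(itertools.product(range(3), repeat=3)), dtype=float)

# Perfect crystal: one atom per site, so no defects.
atoms = sites.copy()
print(count_defects(sites, atoms))  # -> (0, 0)

# Remove one atom and place an extra one near another site:
# one vacancy plus one interstitial (a Frenkel pair).
damaged = np.delete(atoms, 0, axis=0)
damaged = np.vstack([damaged, [1.2, 1.2, 1.2]])
print(count_defects(sites, damaged))  # -> (1, 1)
```

In production analyses the assignment is done with proper periodic boundary conditions and Voronoi (Wigner-Seitz) cells rather than a plain nearest-site search.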

Abstract:

To evaluate the effect of reducing the speed limit in narrow streets in local residential areas in large cities, real traffic tests for pollutant emissions and fuel consumption have been carried out in Madrid city centre. Emission concentrations and car activity were measured simultaneously with a Portable Emissions Measurement System. Real-life tests carried out at different times and on different days were performed with a turbo-diesel light vehicle equipped with an oxidation catalyst, using different driving styles with a previously trained driver. The results show that by reducing the speed limit from 50 km/h to 30 km/h, using a normal driving style, the time taken for a given trip does not increase, while fuel consumption and NOx, CO and PM emissions are clearly reduced. Therefore, the main conclusion of this work is that reducing the speed limit in narrow streets in the residential and commercial areas of a city not only increases pedestrian safety, but also reduces the environmental impact of motor vehicles and their fuel consumption. In addition, the greenhouse gas emissions resulting from the combustion of the fuel are also reduced.

Abstract:

This paper analyses how the internal resources of small- and medium-sized enterprises determine access (learning processes) to technology centres (TCs) or industrial research institutes (innovation infrastructure) in traditional low-tech clusters. These interactions basically represent traded (market-based) transactions, which constitute important sources of knowledge in clusters. The paper addresses the role of TCs in low-tech clusters, using semi-structured interviews with 80 firms in a manufacturing cluster. The results show that producer–user interactions are the most frequent; the more knowledge-intensive a sector's base, the more likely it is to use the available research infrastructure. Conversely, sectors with less knowledge-intensive structures, i.e. less absorptive capacity (AC), present weak linkages to TCs, as they frequently prefer to interact with suppliers, who act as transceivers of knowledge. Therefore, not all firms in a cluster can fully exploit the available research infrastructure, and their AC moderates this engagement. In addition, the mere existence of TCs is not sufficient: firms also need active search strategies to undertake interactions and remain open to available sources of knowledge. The study has implications for policymakers and academia.

Abstract:

Most empirical disciplines promote the reuse and sharing of datasets, as this increases the possibility of replication. While this is increasingly the case in Empirical Software Engineering, some of the most popular bug-fix datasets are now known to be biased. This raises two significant concerns: first, that sample bias may lead to underperforming prediction models, and second, that the external validity of studies based on biased datasets may be suspect. This issue has raised considerable consternation in the ESE literature in recent years. However, there is a confounding factor in these datasets that has not been examined carefully: size. Biased datasets sample only some of the data that could be sampled, and do so in a biased fashion; but biased samples can be smaller or larger. Smaller datasets in general provide a less reliable basis for estimating models, and thus could lead to inferior model performance. In this setting, we ask: what affects performance more, bias or size? We conduct a detailed, large-scale meta-analysis, using simulated datasets sampled with bias from a high-quality dataset which is relatively free of bias. Our results suggest that size always matters just as much as bias direction, and in fact much more than bias direction when considering information-retrieval measures such as AUC and F-score. This indicates that, at least for prediction models, even when dealing with sampling bias, simply finding larger samples can sometimes be sufficient. Our analysis also exposes the complexity of the bias issue and raises further issues to be explored in the future.
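The experimental setup described above (drawing biased samples from a clean dataset and measuring model performance) can be sketched in a few lines. Everything below is a toy stand-in, not the paper's actual data or models: a synthetic "defect" population, a trivial threshold classifier, and a biased sampler that over-represents the buggy class.

```python
# Toy sketch of the bias-vs-size experiment: sample from a clean
# synthetic population, with and without bias, and compare F-scores.
# All data and the classifier are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# "Clean" population: defect-prone files have higher metric values.
N = 20000
y = rng.random(N) < 0.2                      # 20% buggy
x = rng.normal(loc=np.where(y, 2.0, 0.0), scale=1.0)

def f_score(y_true, y_pred):
    tp = np.sum(y_pred & y_true)
    fp = np.sum(y_pred & ~y_true)
    fn = np.sum(~y_pred & y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def fit_and_eval(idx):
    # Trivial model: threshold halfway between the sample class means,
    # evaluated on the full (unbiased) population.
    thr = (x[idx][y[idx]].mean() + x[idx][~y[idx]].mean()) / 2
    return f_score(y, x > thr)

# Small unbiased sample vs. large biased sample
# (buggy examples are 5x as likely to be included).
unbiased_small = rng.choice(N, size=100, replace=False)
w = np.where(y, 5.0, 1.0)
biased_large = rng.choice(N, size=2000, replace=False, p=w / w.sum())

print(fit_and_eval(unbiased_small), fit_and_eval(biased_large))
```

With this toy setup the larger biased sample tends to give a more stable threshold estimate, which is the direction of the paper's finding, though the real study uses actual bug-fix datasets and proper prediction models.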

Abstract:

This work implements an optimization of the phosphorus gettering effect during the contact co-firing step by means of both simulations and experiments in an industrial belt furnace. An optimized temperature profile, named the ‘extended co-firing step’, is presented. Simulations show that the effect of the short annealing on the final interstitial iron concentration depends strongly on the initial contamination level of the material, and that the ‘extended co-firing’ temperature profile can enhance the gettering effect at the cost of only a small additional time. Experimental results using sister wafers from the same multicrystalline silicon ingot confirm these trends and show the potential of this new defect-engineering tool to improve solar cell efficiency.

Abstract:

We study the stability and dynamics of non-Boussinesq convection in pure gases (CO2 and SF6) with Prandtl numbers near Pr ≈ 1 and in a H2-Xe mixture with Pr = 0.17. Focusing on the strongly nonlinear regime, we employ Galerkin stability analyses and direct numerical simulations of the Navier-Stokes equations. For Pr ≈ 1 and intermediate non-Boussinesq effects we find reentrance of stable hexagons as the Rayleigh number is increased. For stronger non-Boussinesq effects the usual transverse side-band instability is superseded by a longitudinal side-band instability. Moreover, the hexagons do not exhibit any amplitude instability to rolls. Seemingly, this result contradicts the experimentally observed transition from hexagons to rolls. We resolve this discrepancy by including the effect of the lateral walls. Non-Boussinesq effects modify the spiral defect chaos observed for larger Rayleigh numbers. For convection in SF6 we find that non-Boussinesq effects strongly increase the number of small, compact convection cells and with it enhance the cellular character of the patterns. In H2-Xe, closer to threshold, we find instead an enhanced tendency toward roll-like structures. In both cases the number of spirals and of target-like components is reduced. We quantify these effects using recently developed diagnostics of the geometric properties of the patterns.

Abstract:

Motivated by the observation of spiral patterns in a wide range of physical, chemical, and biological systems, we present an automated approach that aims at quantitatively characterizing spiral-like elements in complex stripe-like patterns. The approach provides the location of the spiral tip and the size of the spiral arms in terms of their arc length and their winding number. In addition, it yields the number of pattern components (Betti number of order 1), as well as their size and certain aspects of their shape. We apply the method to spiral defect chaos in thermally driven Rayleigh-Bénard convection and find that the arc length of spirals decreases monotonically with decreasing Prandtl number of the fluid and increasing heating. By contrast, the winding number of the spirals is nonmonotonic in the heating. The distribution function for the number of spirals is significantly narrower than a Poisson distribution. The distribution function for the winding number shows an approximately exponential decay. It depends only weakly on the heating, but strongly on the Prandtl number. Large spirals arise only for larger Prandtl numbers. In this regime the joint distribution for the spiral length and the winding number exhibits a three-peak structure, indicating the dominance of Archimedean spirals of opposite sign and relatively straight sections. For small Prandtl numbers the distribution function reveals a large number of small, compact pattern components.
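The winding number used above can be computed for an extracted spiral arm by accumulating the signed angle swept around the spiral tip. The sketch below is an illustrative reimplementation of that one idea, not the paper's full pattern-analysis pipeline (which also handles tip detection, arc length, and Betti numbers).

```python
# Winding number of a digitized curve around a center point, computed by
# accumulating unwrapped angle increments. Illustrative sketch only.
import math

def winding_number(points, center=(0.0, 0.0)):
    """Signed number of turns a polyline makes around `center`."""
    cx, cy = center
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # unwrap jumps across the +/- pi branch cut
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return total / (2 * math.pi)

# Test curve: an Archimedean spiral r = 0.1 * theta with 2.5 turns.
n = 1000
pts = []
for i in range(1, n + 1):
    t = 2.5 * 2 * math.pi * i / n
    pts.append((0.1 * t * math.cos(t), 0.1 * t * math.sin(t)))
print(round(winding_number(pts), 2))  # -> 2.5
```

Traversing the same points in the opposite order flips the sign, which is how spirals of opposite handedness would be distinguished.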

Abstract:

We employ numerical computations of the full Navier-Stokes equations to investigate non-Boussinesq convection in a rotating system using water as the working fluid. We identify two regimes. For weak non-Boussinesq effects the Hopf bifurcation from steady to oscillating (whirling) hexagons is supercritical, and typical states exhibit defect chaos that is systematically described by the cubic complex Ginzburg-Landau equation. For stronger non-Boussinesq effects the Hopf bifurcation becomes subcritical and the oscillations exhibit localized chaotic bursting, which is modeled by a quintic complex Ginzburg-Landau equation.
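For reference, the amplitude equations invoked above take the following standard textbook forms; the notation (amplitude A, growth rate μ, real dispersion coefficients b, c, and quintic coefficients d_r, d_i) is a generic convention and not necessarily the paper's. The cubic complex Ginzburg-Landau equation reads

```latex
\partial_t A = \mu A + (1 + i b)\,\nabla^2 A - (1 + i c)\,|A|^2 A ,
```

while in the subcritical (bursting) regime the cubic nonlinearity is destabilizing and saturation comes from a quintic term:

```latex
\partial_t A = \mu A + (1 + i b)\,\nabla^2 A + (1 + i c)\,|A|^2 A - (d_r + i\, d_i)\,|A|^4 A .
```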

Abstract:

When fresh fruit reaches the final markets from the suppliers, its quality is not always as good as it should be, either because it has been mishandled during transportation or because it lacked adequate quality control at the producer level, before being shipped. This is why the final markets need to establish their own quality assessment system if they want to guarantee their customers the quality they intend to sell. In this work, a system to control fruit quality at the last level of the distribution channel has been designed. The system combines rapid control techniques with laboratory equipment and statistical sampling protocols to obtain a dynamic, objective process which can advantageously substitute the quality control inspections carried out visually by human experts at the reception platform of most hypermarkets. Portable measuring equipment has been chosen (firmness tester, temperature and humidity sensors, ...), as well as easy-to-use laboratory equipment (texturometer, colorimeter, refractometer, ...), combining them to control the most important fruit quality parameters (firmness, colour, sugars, acids). A complete computer network has been designed to control all the processes, store the collected data in real time, and perform the computations. The sampling methods have also been defined to guarantee the confidence of the results. Some of the advantages of a quality assessment system such as the one proposed are: the minimisation of human subjectivity, the ability to use modern measuring techniques, and the possibility of using it also as a supplier's quality control system. It can also be a way to clarify the quality limits of fruits among members of the commercial channel, as well as a first step in the standardisation of quality control procedures.

Abstract:

In this work, we introduce the Object Kinetic Monte Carlo (OKMC) simulator MMonCa and simulate the defect evolution in three different materials. We start by explaining the theory of OKMC and showing some details of how this theory is implemented by creating generic structures and algorithms for the objects that we want to simulate. We then successfully reproduce simulated results for defect evolution in iron, silicon and tungsten using our simulator, and compare them with available experimental data and similar simulations. The comparisons validate MMonCa, showing that it is powerful and flexible enough to be customized and used to study the damage evolution of defects in a wide range of solid materials.
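The generic OKMC engine described above can be illustrated with the standard residence-time (BKL) algorithm: pick one event with probability proportional to its rate, apply it, and advance the clock by an exponentially distributed time step. The sketch below is a toy model with made-up rates and a single "hop" event type; it is not MMonCa code.

```python
# Minimal Object Kinetic Monte Carlo sketch (residence-time algorithm).
# Event types, rates, and the 1D lattice are illustrative toy choices.
import math
import random

random.seed(42)

def okmc_step(objects, rate_of, apply_event, time):
    """Select one event proportionally to its rate, apply it, and
    advance the simulation clock by an exponential time increment."""
    rates = [rate_of(o) for o in objects]
    total = sum(rates)
    r = random.random() * total
    acc = 0.0
    for obj, rate in zip(objects, rates):
        acc += rate
        if r <= acc:
            apply_event(obj)
            break
    return time + -math.log(random.random()) / total

# Toy model: point defects performing unbiased random walks on a 1D lattice.
defects = [{"x": 0}, {"x": 10}, {"x": 20}]
hop_rate = lambda d: 1.0                  # arbitrary toy rate, in s^-1
def hop(d):
    d["x"] += random.choice((-1, 1))

t = 0.0
for _ in range(1000):
    t = okmc_step(defects, hop_rate, hop, t)

print(len(defects), t > 0)  # -> 3 True
```

A real OKMC simulator adds temperature-dependent (Arrhenius) rates per event type, 3D geometry, and reactions such as recombination and clustering when objects meet.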

Abstract:

The era of seed-cast grown monocrystalline-based silicon ingots is coming. Mono-like, pseudomono or quasimono wafers are product labels that can nowadays be found in the market as a critical innovation for the photovoltaic industry. They integrate some of the most favorable features of the conventional silicon substrates for solar cells to date, such as the high solar cell efficiency offered by monocrystalline Czochralski-Si (Cz-Si) wafers and the lower cost, high productivity and full square shape that characterize the well-known multicrystalline casting growth method. Nevertheless, this innovative crystal growth approach still faces a number of mass-scale problems that need to be resolved in order to gain a deep, fully reliable and worldwide market: (i) extended defect formation during the growth process; (ii) optimization of seed recycling; and (iii) parts of the ingots yielding low solar cell performance, which directly affect the production costs and yield of this approach. Therefore, this paper presents a series of casting crystal growth experiments and characterization studies of ingots, wafers and cells manufactured in an industrial approach, showing the main sources of crystal defect formation, impurity enrichment and their potential consequences at the solar cell level. The previously mentioned technological drawbacks are directly addressed, proposing industrial actions to pave the way for this new wafer technology toward high-efficiency solar cells.

Abstract:

With the rise of Cloud Computing, data-processing applications have seen an increase in demand, and achieving greater efficiency in data centres has therefore become important. The objective of this work is to obtain tools for analysing the feasibility and profitability of designing data centres specialized in data processing, with adapted architectures, cooling systems, etc. Some data-processing applications benefit from software architectures, while for others a hardware-based implementation may be more efficient. Since software with very good graph-processing results already exists, such as the XPregel system, this project develops a hardware architecture in VHDL, implementing Google's PageRank algorithm in a scalable way. This algorithm was chosen because it may be more efficient in a hardware architecture, owing to specific characteristics indicated below. PageRank ranks pages by their relevance on the web using graph theory: each web page is a vertex of a graph, and the links between pages are the edges of that graph. This project first analyses the state of the art. The implementation in XPregel, a graph-processing system, is assumed to be among the most efficient, so that implementation is studied. However, because XPregel processes graph algorithms in general, it does not take certain characteristics of the PageRank algorithm into account, and its implementation is therefore not optimal. In PageRank, storing all the data sent by a given vertex is an unnecessary waste of memory, since all the messages a vertex sends are identical to one another and equal to its PageRank.
The VHDL design takes this characteristic of the algorithm into account, avoiding storing identical messages several times. PageRank was implemented in VHDL because current operating-system architectures do not scale adequately; the aim is to evaluate whether a different architecture yields better results. The design is built from scratch, using the automatically generated ROM from the Xilinx IP core (VHDL development software). Four types of modules are planned so that processing can be done in parallel. The XPregel structure is simplified in order to exploit the aforementioned particularity of PageRank, of which XPregel does not take full advantage. The code is then written with a scalable structure, since the computation involves millions of web pages. Next, the code is synthesized and tested on an FPGA. The final step is an evaluation of the implementation and of possible improvements in power consumption.

Abstract:

The preservation of tangible cultural heritage does not guarantee effective revitalisation of urban historic areas as a whole. The legacy of our history consists not only of paintings, sculptures, architectural monuments and public spaces, but also of immaterial aspects of social life that must be safeguarded, such as oral traditions, rituals, practices, knowledge and craft skills. From 1999 to 2013, 26 Brazilian cities benefited from the Monumenta Programme, a national cultural policy that involved institutions, the private sector and the local community. The purpose of the programme was to stimulate economic growth and increase the cultural and social development of the historic centres. Moreover, it sought to increase the number of residents in the benefited areas, as defined in its agenda (IDB, 1999; MinC & Programa Monumenta, 2006). Using the Historic Centre of Porto Alegre as a case study, this paper examines how this cultural programme enables demographic change through the promotion of intangible cultural heritage, e.g. by supporting educational projects. The demographic flow was analysed using the microdata of the Population Censuses (years 2000 and 2010) available from the Brazilian Institute of Geography and Statistics. The results showed an increase in low-income residents in the areas that participated in the programme. This increase may have been motivated by a set of cultural-educational projects under the auspices of the Monumenta Programme. The retraining of the artisans of Alfândega Square, the training of low-income youth for restoration work and the implementation of the "Black Route Museum in Porto Alegre" (Bicca, 2010) are just some examples of what was done to improve the local community's economy, encourage social cohesion and enhance awareness of cultural diversity as a positive and essential value in society.