998 results for Particle accelerators
Abstract:
At sufficiently high laser intensities, the rapid heating to relativistic velocities and the resulting decompression of plasma electrons in an ultra-thin target foil can cause the target to become relativistically transparent to the laser light during the interaction. Ion acceleration in this regime is strongly affected by the transition from an opaque to a relativistically transparent plasma. By spatially resolving the laser-accelerated proton beam at near-normal laser incidence and at an incidence angle of 30°, we identify, both experimentally and in particle-in-cell simulations, characteristic features consistent with the onset of three distinct ion acceleration mechanisms: sheath acceleration; radiation pressure acceleration; and transparency-enhanced acceleration. The latter mechanism occurs late in the interaction and is mediated by the formation of a plasma jet extending into the expanding ion population. The effect of the laser incidence angle on the plasma jet is explored.
Abstract:
All-optical approaches to particle acceleration are currently attracting a significant research effort internationally. Although characterized by exceptional transverse and longitudinal emittance, laser-driven ion beams currently have limitations in terms of peak ion energy, bandwidth of the energy spectrum and beam divergence. Here we introduce the concept of a versatile, miniature linear accelerating module, which, by employing laser-excited electromagnetic pulses directed along a helical path surrounding the laser-accelerated ion beams, addresses these shortcomings simultaneously. In a proof-of-principle experiment on a university-scale system, we demonstrate post-acceleration of laser-driven protons from a flat foil at a rate of 0.5 GeV m^-1, already beyond what can be sustained by conventional accelerator technologies, with dynamic beam collimation and energy selection. These results open up new opportunities for the development of extremely compact and cost-effective ion accelerators for both established and innovative applications.
Abstract:
As a leading facility in laser-driven nuclear physics, ELI-NP will develop innovative research in the fields of materials behavior in extreme environments and radiobiology, with applications in the development of accelerator components, new materials for next-generation fusion and fission reactors, shielding solutions for equipment and human crews in long-term space missions, and new biomedical technologies. The specific properties of the laser-driven radiation produced with two 1 PW lasers, each operating at a pulse repetition rate of 1 Hz, are an ultra-short time scale, a relatively broadband spectrum and the possibility of providing several types of radiation simultaneously. Complex, cosmic-like radiation will be produced in a ground-based laboratory, allowing comprehensive investigations of its effects on materials and biological systems. The expected maximum energy and intensity of the radiation beams are 19 MeV with 10^9 photons/pulse for photon radiation, 2 GeV with 10^8 electrons/pulse for electron beams, 60 MeV with 10^12 protons/pulse for proton and ion beams, and 60 MeV with 10^7 neutrons/pulse for a neutron source. Research efforts will also be directed towards radioprotection measurements of the prompt and activated dose, as a function of laser and target characteristics, and towards the development and testing of various dosimetric methods and equipment.
Abstract:
The biological effectiveness of laser-driven protons on cells at high dose rate in a single exposure has been studied. V79 cell lines were irradiated with laser-driven protons.
Abstract:
The development of computational systems is a complex, multi-stage process that requires a thorough analysis of the problem, taking into account the applicable constraints and requirements. This task involves exploring alternative techniques and computational algorithms to optimize the system and satisfy the established requirements. In this context, one of the most important stages is the analysis and implementation of computational algorithms. Enormous technological advances in FPGAs (Field-Programmable Gate Arrays) have made it possible to develop extremely complex engineering systems. However, the number of transistors available per chip is growing faster than our capacity to develop systems that take advantage of that growth. This well-known limitation, already observed with ASICs (Application-Specific Integrated Circuits) before it became apparent with FPGAs, has been increasing continuously. The development of systems based on high-capacity FPGAs involves a wide variety of tools, including methods for the efficient implementation of computational algorithms. This thesis aims to contribute to this area by exploiting reuse, higher levels of abstraction, and clearer, more automated algorithmic specifications. More specifically, it presents a study carried out to derive criteria for the hardware implementation of recursive versus iterative algorithms. After presenting some of the most significant strategies for implementing recursion in hardware, a set of algorithms for solving combinatorial search problems (taken as application examples) is described in detail. Recursive and iterative versions of these algorithms were implemented and tested on FPGA. Based on the results obtained, a careful comparative analysis is made. New research tools and techniques developed within the scope of this thesis are also discussed and demonstrated.
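Since synthesizable hardware descriptions do not support unbounded run-time recursion, implementing a recursive search algorithm on an FPGA generally means either mapping the call stack explicitly or rewriting the algorithm iteratively. As a minimal illustration of the recursive-versus-iterative comparison described above (the thesis's own algorithms and tools are not given in this abstract), the Python sketch below expresses the same combinatorial search, N-Queens counting, both ways:

```python
# Illustrative sketch of the recursive-vs-iterative transformation studied
# for combinatorial search. N-Queens counting is a stand-in example; the
# thesis's own algorithms are not specified in this abstract.

def queens_recursive(n, row=0, cols=0, diag1=0, diag2=0):
    """Count N-Queens solutions with direct recursion (bitmask board state)."""
    if row == n:
        return 1
    count = 0
    free = ~(cols | diag1 | diag2) & ((1 << n) - 1)  # columns not attacked
    while free:
        bit = free & -free                            # lowest free column
        free ^= bit
        count += queens_recursive(n, row + 1,
                                  cols | bit,
                                  (diag1 | bit) << 1,
                                  (diag2 | bit) >> 1)
    return count

def queens_iterative(n):
    """Same search with an explicit stack, as required for hardware mapping."""
    full = (1 << n) - 1
    count = 0
    stack = [(0, 0, 0, 0)]                            # (row, cols, diag1, diag2)
    while stack:
        row, cols, diag1, diag2 = stack.pop()
        if row == n:
            count += 1
            continue
        free = ~(cols | diag1 | diag2) & full
        while free:
            bit = free & -free
            free ^= bit
            stack.append((row + 1, cols | bit,
                          ((diag1 | bit) << 1) & full,
                          (diag2 | bit) >> 1))
    return count

if __name__ == "__main__":
    assert queens_recursive(8) == queens_iterative(8) == 92
    print(queens_iterative(8))
```

The explicit-stack version visits the same search tree in a different order but produces the same count, which is the kind of functional equivalence the FPGA comparison relies on.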
Abstract:
Forest fires are an important source of emissions of gaseous compounds and aerosols. In Portugal, where most fires occur in the north and centre of the country, fires destroy thousands of hectares every year, with major losses in economic terms, human lives and environmental quality. The emissions can considerably alter atmospheric chemistry, degrade air quality and change the climate. However, information on the characteristics of forest fire emissions in Mediterranean countries is limited. At both national and international level, there is growing interest in compiling emission inventories and in regulations on carbon emissions to the atmosphere. From the standpoint of atmospheric monitoring, fires are considered a challenge, given their temporal and spatial variability, the expected increase in their frequency, size and severity, and the fact that emission estimates depend on the characteristics of the biofuels and on the combustion phase. The aim of this study was to quantify and characterise the gaseous and aerosol emissions from some of the most representative forest fires that occurred in central Portugal in the summers of 2009 and 2010. Samples of gases and of two particle fractions (PM2.5 and PM2.5-10) were collected from the smoke plumes in Tedlar bags and on quartz filters coupled to a high-volume sampler, respectively. Total hydrocarbons (THC) and carbon oxides (CO and CO2) in the gaseous samples were analysed with automatic flame ionisation and non-dispersive infrared instruments, respectively. For some samples, carbonyl compounds were also quantified after resampling the gas from the Tedlar bags onto silica gel cartridges coated with 2,4-dinitrophenylhydrazine (DNPH), followed by analysis by high-performance liquid chromatography. In the particles, organic and elemental carbon (thermo-optical technique), water-soluble ions (ion chromatography) and elements (inductively coupled plasma mass spectrometry or instrumental neutron activation analysis) were analysed. Organic speciation was obtained by gas chromatography coupled to mass spectrometry, after extraction with various solvents and separation of the organic extracts into several classes of different polarities through silica gel fractionation. Emission factors of CO and CO2 were in the ranges 52-482 and 822-1690 g kg-1 (dry basis), showing negative and positive correlations with combustion efficiency, respectively. THC emission factors were higher during the smouldering phase, ranging from 0.33 to 334 g kg-1 (dry basis). The most abundant oxygenated volatile organic compound was acetaldehyde, with emission factors ranging from 1.0 to 3.2 g kg-1 (dry basis), followed by formaldehyde and propionaldehyde. Emissions of these compounds were observed to be promoted during the smouldering phase. PM2.5 and PM10 emission factors ranged between 0.50-68 and 0.86-72 g kg-1 (dry basis), respectively. The emission of fine and coarse particles is also promoted under smouldering conditions. PM2.5 represented about 90% of the PM10 particle mass. The carbonaceous fraction of the particles sampled in all fires was clearly dominated by organic carbon.
A wide range of organic-to-elemental carbon ratios was obtained, depending on the combustion conditions. However, all ratios reflected a higher proportion of organic carbon relative to elemental carbon, typical of biomass burning emissions. Water-soluble ions in the smoke plume particles contributed up to 3.9% of the PM2.5 mass and 2.8% of the PM2.5-10 mass. Potassium contributed up to 15 µg mg-1 of PM2.5 and 22 µg mg-1 of PM2.5-10, although in absolute mass it was mostly present in the fine particles. The potassium-to-elemental-carbon and potassium-to-organic-carbon ratios obtained in the smoke plume particles fall within the range of values reported in the literature for biomass burning emissions. The elements detected in the samples represented, on average, up to 1.2% and 12% of the PM2.5 and PM2.5-10 mass, respectively. Particles resulting from more complete combustion (high CO2 and low CO values) were characterised by a high content of inorganic constituents and a lower content of organic matter. Particulate organic matter was observed to consist mainly of phenolic components and derived products, homologous compound series (alkanes, alkenes, alkanoic acids and alkanols), sugars, steroid and terpenoid biomarkers, and polycyclic aromatic hydrocarbons. Retene, a biomarker of conifer burning emissions, was the dominant aromatic hydrocarbon in the smoke plume samples collected during the 2009 campaign, owing to the predominance of samples taken from fires in pine forests. The main anhydrosugar, and always one of the most abundant compounds, was levoglucosan. The levoglucosan/OC ratios obtained in the smoke plume particles averaged from 5.8 to 23 mg g-1 OC. The levoglucosan/mannosan and levoglucosan/(mannosan+galactosan) ratios revealed the predominance of samples from conifer burning. Given that estimating forest fire emissions requires knowledge of emission factors appropriate to each biofuel, the comprehensive database obtained in this study is potentially useful for updating emission inventories. It has been observed that the smouldering phase, which can occur simultaneously with the flaming phase and last several hours or days, can contribute a considerable amount of atmospheric pollutants, so the corresponding emission factors should be considered when calculating global forest fire emissions. Owing to the lack of detailed information on chemical emission profiles, the database obtained in this study may also be useful for the application of receptor models in southern Europe.
Abstract:
This paper describes the first use of inter-particle force measurement in reworked aerosols to better understand the mechanics of dust deflation and its consequent ecological ramifications. Dust is likely to carry hydrocarbons and micro-organisms, including human pathogens and cultured microbes, and is thereby a threat to plants, animals and humans. Present-day global aerosol emissions are substantially greater than in 1850; however, the projected influx rates are highly disputed. This uncertainty, in part, has roots in the lack of understanding of deflation mechanisms. A growing body of literature shows that, if carbon emissions continue to increase, plant transpiration drops and soil water retention is enhanced, allowing more greenery to grow and less dust to flux. On the other hand, a small but important body of geochemistry literature shows that increasing emissions and global temperature lead to extreme climates, decalcification of surface soils containing soluble carbonate polymorphs and hence a greater chance of deflation. The consistency of loosely packed reworked silt provides background data against which the resistance of dust's bonding components (carbonates and water) can be compared. The use of macro-scale phenomenological approaches to measure dust consistency is trivial. Instead, consistency can be measured in terms of the inter-particle stress state. This paper describes a semi-empirical parametrisation of the inter-particle cohesion forces in terms of the balance of contact-level forces at the instant of particle motion, as sketched below. We put forward the hypothesis that the loss of Ca2+-based pedogenic salts is responsible for much of the dust influx and that surficial drying plays a less significant role.
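The abstract does not reproduce the parametrisation itself. As a rough illustrative sketch (an assumed generic form, not taken from the paper), force-balance treatments of incipient particle motion typically set detachment at the point where aerodynamic forcing overcomes particle weight and inter-particle cohesion:

```latex
% Illustrative contact-level force balance at incipient motion
% (an assumed generic form, not the paper's parametrisation):
\[
  F_{\mathrm{aero}} \;\ge\; F_g + F_c ,
  \qquad
  F_g = \frac{\pi}{6}\,\rho_p\, g\, d^3 ,
  \qquad
  F_c = \beta\, d ,
\]
% where $d$ is the particle diameter, $\rho_p$ the particle density, and
% $\beta$ an empirical cohesion parameter lumping the water (capillary)
% and carbonate (cementation) bonding components compared in the paper.
```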
Abstract:
The genus Alternaria includes many species that grow on economically important crops, causing significant yield losses. Less attention has been paid to fungi growing on plants that constitute substantial components of pastures and meadows. Alternaria spp. spores are also recognised as important allergens. A 7-day volumetric spore trap was used to monitor the concentration of airborne fungal spores. Air samples were collected in Worcester, England (2006–2010). Days with a high spore count were then selected, and the longest episode that occurred within the five-year study was chosen for modelling. Two source maps presenting the distribution of crops under rotation and of pastures in the UK were produced. Back trajectories were calculated using the HYSPLIT model. In ArcGIS, clusters of trajectories were studied in connection with the source maps, taking into account the height above ground level and the speed of the air masses. During the episode, no evidence of long-distance transport of Alternaria spp. spores from the continent was detected. The overall direction of the air masses fell within the range from South-West to North. The back trajectories indicated that the most important sources of Alternaria spp. spores were located in the West Midlands of England.
Abstract:
This paper addresses the problem of energy resources management using modern metaheuristic approaches, namely Particle Swarm Optimization (PSO), New Particle Swarm Optimization (NPSO) and Evolutionary Particle Swarm Optimization (EPSO). The problem addressed in this paper is intended for use by aggregators operating in a smart grid context, dealing with Distributed Generation (DG) and with gridable vehicles intelligently managed on a multi-period basis according to their users' profiles and requirements. The aggregator can also purchase additional energy from external suppliers. The paper includes a case study considering a 30 kV distribution network with one substation, 180 buses and 90 load points. The distribution network in the case study features intense penetration of DG, including 116 units from several technologies, and one external supplier. A scenario of 6000 EVs in the given network is simulated over 24 periods, corresponding to one day. The results of the application of the PSO approaches to this case study are discussed in detail in the paper.
Abstract:
This paper proposes a particle swarm optimization (PSO) approach to support electricity producers in multiperiod optimal contract allocation. The producer's risk preference is stated by a utility function (U) expressing the tradeoff between the expectation and the variance of the return. The expected return and variance estimates are based on a forecasted scenario interval determined by a price-range forecasting model developed by the authors. A confidence level is associated with each forecasted scenario interval. The proposed model makes use of contracts with physical (spot and forward) and financial (options) settlement. PSO performance was evaluated by comparing it with a genetic-algorithm-based approach. The model can be used by producers in deregulated electricity markets, but can easily be adapted to load-serving entities and retailers, as well as to the use of other types of contracts.
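The abstract leaves the exact form of U unspecified; a common mean-variance choice (an assumption for illustration, not necessarily the authors' function) weighs expected return against its variance with a risk-aversion coefficient, evaluated over returns sampled from the forecasted scenario interval:

```python
import statistics

def utility(returns, risk_aversion):
    """Mean-variance utility U = E[R] - k * Var[R] (assumed form,
    not necessarily the authors' exact function)."""
    expected = statistics.mean(returns)
    variance = statistics.pvariance(returns)
    return expected - risk_aversion * variance

# Hypothetical returns of one contract portfolio across price scenarios
# drawn from a forecasted price range (illustrative values only):
scenario_returns = [120.0, 135.0, 98.0, 110.0, 127.0]
print(utility(scenario_returns, risk_aversion=0.05))
```

A higher risk-aversion coefficient penalizes contract portfolios whose returns vary more across the forecasted price scenarios, which is the tradeoff the abstract describes.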
Abstract:
Distributed Energy Resources (DER) scheduling in smart grids presents a new challenge to system operators. The increase of new resources, such as storage systems and demand response programs, results in additional computational effort for optimization problems. On the other hand, since natural resources such as wind and sun can only be forecasted accurately a short time in advance, short-term scheduling is especially relevant, requiring very good performance on large-dimension problems. Traditional techniques such as Mixed-Integer Non-Linear Programming (MINLP) do not cope well with large-scale problems. Such problems can be appropriately addressed by metaheuristic approaches. This paper proposes a new methodology called Signaled Particle Swarm Optimization (SiPSO) to address the energy resources management problem in the scope of smart grids with intensive use of DER. The proposed methodology's performance is illustrated by a case study with 99 distributed generators, 208 loads, and 27 storage units. The results are compared with those obtained with other methodologies, namely MINLP, a Genetic Algorithm, the original Particle Swarm Optimization (PSO), Evolutionary PSO, and New PSO. SiPSO's performance is superior to that of the other tested PSO variants, demonstrating its adequacy for solving large-dimension problems that require a decision in a short period of time.
Abstract:
Short-term risk management is highly dependent on previously established long-term contractual decisions, the agent's risk aversion factor, and short-term price forecast accuracy. To address this problem, this paper provides a different approach to short-term risk management in electricity markets. Based on long-term contractual decisions and making use of a price-range forecast method developed by the authors, the short-term risk management tool presented here has as its main concern finding the optimal spot market strategies that a producer should adopt for a specific day as a function of his risk aversion factor, with the objective of maximizing profits while simultaneously hedging against market price volatility. Due to the complexity of the optimization problem, the authors make use of Particle Swarm Optimization (PSO) to find the optimal solution. Results from realistic data, namely from the OMEL electricity market, are presented and discussed in detail.
Abstract:
The concept of demand response has growing importance in the context of future power systems. Demand response can be seen as a resource like distributed generation, storage, electric vehicles, etc. All these resources require an infrastructure able to give players the means to operate and use them in an efficient way. This infrastructure implements in practice the smart grid concept, and should accommodate a large number of diverse types of players in the context of a competitive business environment. In this paper, demand response is optimally scheduled jointly with other resources, such as distributed generation units and the energy provided by the electricity market, minimizing the operation costs from the point of view of a virtual power player who manages these resources and supplies the aggregated consumers. The optimal schedule is obtained using two approaches based on particle swarm optimization (with and without mutation), which are compared with a deterministic approach used as a reference methodology. A case study with two scenarios implemented in DemSi, a demand response simulator developed by the authors, evidences the advantages of the proposed particle swarm approaches.
Abstract:
Competitive electricity markets have arisen as a result of power-sector restructuring and power-system deregulation. The players participating in competitive electricity markets must define strategies and make decisions using all the available information and business opportunities.
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resources management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aim to improve its adequacy for solving this problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of the PSO parameters, making it more user-independent. It presents better robustness and convergence characteristics than the tested PSO variants, as well as better constraint handling. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios must be considered. The paper includes two realistic case studies with different penetrations of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with about a one percent difference in the objective function cost value.
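The ASMPSO mechanisms themselves are only summarised above; the sketch below shows the canonical PSO update with clamped velocities that such variants build on, with vmax held fixed where ASMPSO would adjust it during the search (all names, parameters and the toy cost function are illustrative):

```python
import random

def pso_minimize(cost, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, vmax=0.5,
                 lo=-10.0, hi=10.0):
    """Canonical PSO with clamped velocities. ASMPSO (per the abstract)
    adjusts the velocity limits during the search; here vmax is fixed."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                v = (w * vel[i][d]
                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                     + c2 * r2 * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-vmax, min(vmax, v))   # velocity clamping
                pos[i][d] = max(lo, min(hi, pos[i][d] + vel[i][d]))
            val = cost(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy quadratic cost standing in for an energy-resource scheduling objective:
best, best_val = pso_minimize(lambda x: sum(xi * xi for xi in x), dim=5)
print(best_val)
```

Tightening or loosening vmax trades exploration against convergence speed, which is why adjusting it during the search, as ASMPSO is described as doing, can improve robustness on large scheduling problems.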