248 results for Lightning.


Relevance:

10.00%

Publisher:

Abstract:

A numerical model for studying the influence of deep convective cloud systems on photochemistry was developed based on a non-hydrostatic meteorological model and chemistry from a global chemistry transport model. The transport of trace gases, the scavenging of soluble trace gases, and the influence of lightning-produced nitrogen oxides (NOx = NO + NO2) on the local ozone-related photochemistry were investigated in a multi-day case study for an oceanic region in the tropical western Pacific. Model runs that considered the influence of large-scale flows, previously neglected in multi-day cloud-resolving and single-column model studies of tracer transport, showed that the influence of the mesoscale subsidence (between clouds) on trace gas transport had been considerably overestimated in those studies. The simulated vertical transport and scavenging of highly soluble tracers were found to depend on the initial profiles, reconciling contrasting results from two previous studies. The influence of the modeled uptake of trace gases by hydrometeors in the liquid and ice phases was studied in some detail for a small number of atmospheric trace gases, and novel aspects of the role of the retention coefficient (i.e. the fraction of a dissolved trace gas that is retained in the ice phase upon freezing) in the vertical transport of highly soluble gases were illuminated. Including lightning NOx production inside a 500 km 2-D model domain was found to be important for the NOx budget and caused small to moderate changes in the domain-averaged ozone concentrations. A number of sensitivity studies showed that the fraction of lightning-associated NOx lost through photochemical reactions in the vicinity of the lightning source was considerable, but depended strongly on assumptions about the magnitude and altitude of the lightning NOx source. In contrast to a suggestion from an earlier study, it was argued that the near-zero upper-tropospheric ozone mixing ratios observed close to the study region were most probably not caused by the formation of NO associated with lightning. Instead, it was argued, in agreement with suggestions from other studies, that the deep convective transport of ozone-poor air masses from the relatively unpolluted marine boundary layer, which had most likely been advected horizontally over relatively large distances (both before and after encountering deep convection), probably played a role. In particular, it was suggested that the ozone profiles observed during CEPEX (Central Equatorial Pacific Experiment) were strongly influenced by the deep convection and the larger-scale flow associated with the intra-seasonal oscillation.
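A compact way to state the role of the retention coefficient mentioned above (the symbols are illustrative, not taken from the thesis): if a hydrometeor carries a dissolved amount $c_{\mathrm{liq}}$ of a trace gas just before freezing, then

$$c_{\mathrm{ice}} = r\,c_{\mathrm{liq}}, \qquad c_{\mathrm{released}} = (1 - r)\,c_{\mathrm{liq}},$$

so for $r \to 1$ a highly soluble gas stays in the ice and is removed with the precipitating or sedimenting hydrometeor, while for $r \to 0$ it is returned to the gas phase during freezing and remains available for further vertical transport.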

Relevance:

10.00%

Publisher:

Abstract:

Nitric oxide (NO) is important for several chemical processes in the atmosphere. Together with nitrogen dioxide (NO2) it is better known as nitrogen oxides (NOx). NOx is crucial for the production and destruction of ozone. In several reactions it catalyzes the oxidation of methane and volatile organic compounds (VOCs), and in this context it is involved in the cycling of the hydroxyl radical (OH). OH is a reactive radical capable of oxidizing most organic species; it is therefore also called the “detergent” of the atmosphere. Nitric oxide originates from several sources: fossil fuel combustion, biomass burning, lightning and soils. Fossil fuel combustion is the largest source; the others are, depending on the reviewed literature, generally comparable to each other. The individual sources differ in the temporal and spatial patterns of their emissions. Fossil fuel combustion dominates in densely populated places, where NO from other sources is less important. In contrast, NO emissions from soils (hereafter SNOx) or from biomass burning are the dominant source of NOx in remote regions. By applying an atmospheric chemistry global climate model (AC-GCM) I demonstrate that SNOx is responsible for a significant part of the NOx in the atmosphere. Furthermore, it increases the O3 and OH mixing ratios substantially, leading to a ∼10% increase in the oxidizing efficiency of the atmosphere. Interestingly, through reduced O3 and OH mixing ratios in simulations without SNOx, the lifetime of NOx increases in regions with other dominating sources of NOx.
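A sketch of the catalytic cycle alluded to above, in standard tropospheric chemistry notation (textbook chemistry, not a scheme quoted from the thesis): peroxy radicals formed during the oxidation of CO, CH4 or VOCs convert NO to NO2, and NO2 photolysis then produces ozone while NO is recycled,

$$\mathrm{HO_2 + NO \rightarrow NO_2 + OH}$$
$$\mathrm{NO_2 + h\nu \rightarrow NO + O(^3P)} \qquad (\lambda \lesssim 420\ \mathrm{nm})$$
$$\mathrm{O(^3P) + O_2 + M \rightarrow O_3 + M}$$

which is why adding a NOx source such as soil emissions raises both the O3 and the OH mixing ratios.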

Relevance:

10.00%

Publisher:

Abstract:

Nitrogen is an essential nutrient: for humans, animals and plants it is a constituent element of proteins and nucleic acids. Although the majority of the Earth's atmosphere consists of elemental nitrogen (N2, 78 %), only a few microorganisms can use it directly. To be useful for higher plants and animals, elemental nitrogen must be converted to a reactive, oxidized form. This conversion happens within the nitrogen cycle through free-living microorganisms, symbiotic Rhizobium bacteria, or lightning. Since the beginning of the 20th century humans have been able to synthesize reactive nitrogen through the Haber-Bosch process, which noticeably improved the food security of the world population. On the other hand, the increased nitrogen input results in acidification and eutrophication of ecosystems and in loss of biodiversity, and negative health effects for humans arise, for example from fine particulate matter and summer smog. Furthermore, reactive nitrogen plays a decisive role in atmospheric chemistry and in the global cycling of pollutants and nutrients.

Nitrogen monoxide (NO) and nitrogen dioxide (NO2) belong to the reactive trace gases and are grouped under the generic term NOx. They are important components of atmospheric oxidation processes and influence the lifetime of various less reactive greenhouse gases. NO and NO2 are generated, among other pathways, during combustion by oxidation of atmospheric nitrogen, as well as by biological processes in soil. In the atmosphere NO is converted very quickly into NO2. NO2 is then oxidized to nitrate (NO3-) and nitric acid (HNO3), which binds to aerosol particles; the bound nitrate is finally removed from the atmosphere by dry and wet deposition. Catalytic reactions of NOx are an important part of atmospheric chemistry, forming or destroying tropospheric ozone (O3). In the atmosphere NO, NO2 and O3 are in photostationary equilibrium, which is why they are referred to as the NO-NO2-O3 triad. In regions with elevated NO concentrations, reactions with air pollutants can form additional NO2, shifting the equilibrium of ozone formation.

Plants take up the essential nutrient nitrogen mainly as dissolved NO3- entering through the roots. Atmospheric nitrogen is oxidized to NO3- within the soil by bacteria, via nitrogen fixation or ammonium formation and nitrification. In addition, atmospheric NO2 is taken up directly through the stomata. Inside the apoplast NO2 disproportionates to nitrate and nitrite (NO2-), which can enter the plant's metabolic processes; the enzymes nitrate reductase and nitrite reductase convert nitrate and nitrite to ammonium (NH4+). NO2 gas exchange is controlled by pressure gradients inside the leaves, the stomatal aperture and leaf resistances. Stomatal regulation, in turn, is affected by climate factors such as light intensity, temperature and water vapor pressure deficit.

This thesis aims to contribute to the understanding of the role of vegetation in the atmospheric NO2 cycle and to discuss the NO2 compensation point concentration (mcomp,NO2). To this end, NO2 exchange between the atmosphere and spruce (Picea abies) was measured at the leaf level with a dynamic plant chamber system under laboratory and field conditions. Measurements took place during the EGER project (June-July 2008). In addition, NO2 data collected on oak (Quercus robur) during the ECHO project (July 2003) were analyzed. The measuring system allowed the simultaneous determination of NO, NO2, O3, CO2 and H2O exchange rates.
Calculations of the NO, NO2 and O3 fluxes are based on the generally small differences (∆mi) measured between the inlet and the outlet of the chamber; consequently, a high accuracy and specificity of the analyzers is necessary. To meet these requirements, a highly specific NO/NO2 analyzer was used and the whole measurement system was optimized for an enduring measurement precision. Data analysis yielded a significant mcomp,NO2 only when the statistical significance of ∆mi was established; the significance of ∆mi was therefore used as a data quality criterion. Photochemical reactions of the NO-NO2-O3 triad in the volume of the dynamic plant chamber must be considered when determining the NO, NO2 and O3 exchange rates, otherwise the deposition velocity (vdep,NO2) and mcomp,NO2 will be overestimated. No significant mcomp,NO2 could be determined for spruce under laboratory conditions, but under field conditions mcomp,NO2 was between 0.17 and 0.65 ppb and vdep,NO2 between 0.07 and 0.42 mm s-1. For the oak field data no NO2 compensation point concentration could be determined; vdep,NO2 ranged between 0.6 and 2.71 mm s-1. There is increasing evidence that forests are mainly a sink for NO2 and that potential NO2 emissions are low. Only if high NO soil emissions are assumed can more NO2 be formed by reaction with O3 than the plants are able to take up; under these circumstances forests can be a source of NO2.
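The flux and compensation-point quantities above can be written generically for a dynamic chamber (illustrative notation, not the exact formulation of the thesis): with purge air flow rate $Q$ through a chamber enclosing leaf area $A$, the net NO2 exchange flux follows from the inlet/outlet difference (after converting mixing ratios to concentrations), and the compensation point is the ambient mixing ratio at which that flux changes sign,

$$F_{\mathrm{NO_2}} = \frac{Q}{A}\left(m_{\mathrm{out}} - m_{\mathrm{in}}\right), \qquad F_{\mathrm{NO_2}} \approx -\,v_{\mathrm{dep,NO_2}}\left(m_{\mathrm{NO_2}} - m_{\mathrm{comp,NO_2}}\right),$$

so that for ambient NO2 above $m_{\mathrm{comp,NO_2}}$ the leaf is a net sink (deposition) and below it a net source (emission). Gas-phase reactions of the NO-NO2-O3 triad inside the chamber volume add an apparent source/sink term that must be accounted for before $v_{\mathrm{dep,NO_2}}$ and $m_{\mathrm{comp,NO_2}}$ are derived.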

Relevance:

10.00%

Publisher:

Abstract:

So-called cascading events, which lead to high-impact, low-frequency scenarios, are raising concern worldwide. A chain of events results in a major industrial accident with dreadful (and often unpredicted) consequences. Cascading events can result from the realization of an external threat, such as a terrorist attack or a natural disaster, or from a “domino effect”. During domino events the escalation of a primary accident is driven by the propagation of the primary event to nearby units, causing an overall increase in accident severity and in the risk associated with an industrial installation. Natural disasters such as intense flooding, hurricanes, earthquakes and lightning have also been found capable of increasing the risk of an industrial area, triggering losses of containment of hazardous materials and major accidents. The scientific community usually refers to such accidents as “NaTechs”: natural events triggering industrial accidents. In this document, a state of the art of the available approaches to the modelling, assessment, prevention and management of domino and NaTech events is described. However, the relevant work carried out in past studies still needs to be consolidated and completed in order to be applicable in a real industrial framework. New methodologies, developed during my research activity and aimed at the quantitative assessment of domino and NaTech accidents, are presented. The tools and methods provided in this study are intended to support progress toward a consolidated and universal methodology for the assessment and prevention of cascading events, contributing to the safety and sustainability of the chemical and process industry.
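One common building block of the kind of quantitative assessment described above is an escalation probability obtained from a probit dose-response relation for the physical effect received by a secondary unit. The sketch below is only illustrative: the probit coefficients and the use of heat flux as the dose are placeholder assumptions, not values or choices taken from the thesis or from any published correlation.

```python
# Illustrative sketch of one step in quantitative domino assessment: an
# escalation probability from a generic probit dose-response model.
# The coefficients a, b and the dose definition are placeholders only.
from math import log
from statistics import NormalDist


def escalation_probability(heat_flux_kw_m2: float, a: float = -2.4, b: float = 2.0) -> float:
    """Probability that a primary fire escalates to a nearby target unit.

    Generic probit form: Y = a + b * ln(dose); P = Phi(Y - 5),
    with the received heat flux (kW/m2) used as the dose for illustration.
    """
    if heat_flux_kw_m2 <= 0:
        return 0.0
    y = a + b * log(heat_flux_kw_m2)
    return NormalDist().cdf(y - 5.0)


# Example: escalation probability for a target vessel receiving 40 kW/m2.
print(f"P(escalation) ~ {escalation_probability(40.0):.3f}")
```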

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: First investigations of the interactions between weather and the incidence of acute myocardial infarction date back to 1938. The early observation of a higher incidence of myocardial infarctions in the cold season has been confirmed in very different geographical regions and cohorts. While the influence of seasonal variations on the incidence of myocardial infarctions has been extensively documented, the impact of individual meteorological parameters on the disease has so far not been investigated systematically. Hence the present study intended to assess the impact of the essential weather and climate variables on the incidence of myocardial infarctions. METHODS: The daily incidence of myocardial infarctions was calculated from a national hospitalization survey. The hourly weather and climate data were provided by the database of the national weather service. The epidemiological and meteorological data were correlated by multivariate analysis based on a generalized linear model assuming a log link function and a Poisson distribution. RESULTS: High ambient pressure, high pressure gradients, and strong wind activity were associated with an increase in the incidence of the 6560 hospitalizations for myocardial infarction in total, irrespective of the geographical region. Snowfall and rainfall had inconsistent effects. Temperature, Foehn, and lightning showed no statistically significant impact. CONCLUSIONS: Ambient pressure, pressure gradient, and wind activity had a statistical impact on the incidence of myocardial infarctions in Switzerland from 1990 to 1994. To establish a cause-and-effect relationship, more data are needed on the interaction between the pathophysiological mechanisms of the acute coronary syndrome and weather and climate variables.
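A minimal sketch of the regression described in the METHODS, assuming a tidy daily data set: the file name and column names are hypothetical, and the covariate list merely illustrates the variables named above.

```python
# Poisson regression (GLM with log link) of daily infarction counts on
# meteorological covariates. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("daily_mi_weather.csv")  # one row per day
X = sm.add_constant(df[["pressure_hpa", "pressure_gradient_hpa", "wind_speed_ms",
                        "temperature_c", "snowfall_cm", "rainfall_mm"]])
y = df["mi_count"]  # daily hospitalizations for myocardial infarction

# The log link is the default link of the Poisson family in statsmodels.
model = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(model.summary())  # rate ratios are exp(coefficients)
```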

Relevance:

10.00%

Publisher:

Abstract:

Although long distance running clearly has benefits, as witnessed by its popularity, it also carries risks of injury and death. Little is known, however, about the prevalence of potentially dangerous training habits in long distance runners, although anecdotal information suggests that many runners have erroneous beliefs about the risks and benefits of marathon running. We conducted a cross-sectional survey to estimate the prevalence of 19 potentially dangerous training habits (risky behaviors) among marathon runners. A 66-item self-administered questionnaire was mailed to a stratified random sample of runners who finished the 1992 Houston-Tenneco Marathon and were 21-71 years of age. Responses were obtained from 508 runners (83%), with approximately equal representation in four age-gender groups: men <40 years, men ≥40 years, women <40 years, and women ≥40 years. Prevalences of risky behaviors were high. 50% or more ran in dangerously hot and humid conditions, did not cool down or stretch after running, did not wear proper running gear, or ran when injured or ill; 25-49% did not warm up, ran on dangerous surfaces, did not drink sufficient water during training, increased weekly mileage too quickly, or ran during lightning storms; 10-24% ran daily, ran in areas with high pollution, ran in the same direction as traffic, did hard runs frequently, ran more than 60 miles per week, or ran against the advice of a physician. Positive associations were found between the practice of risky behaviors and the self-reported prevalence of musculoskeletal injuries, heat-related injuries, noncompliance with recommendations for preventive health examinations, and noncompliance with positive health habits. These results indicate that many marathon runners engage in training habits that may increase the risk of substantial injury or illness. Further studies are needed to explore the association of risky training behaviors with the incidence of injuries, and to determine the reasons for noncompliance with recommendations from sports medicine specialists.

Relevance:

10.00%

Publisher:

Abstract:

The objectives proposed in this research made it possible to identify and document the barriers to access and the risk factors in gynecological and obstetric health care, and to evaluate the social and family situation of the migrant population, implementing actions for inclusion in the provincial health system for those patients who required more complex care. The interdisciplinary fieldwork in one community made it possible to study 99 women of childbearing age, 45% of them of Bolivian origin, with high social vulnerability, living in the Belgrano District of Guaymallén. A protocolized, descriptive and observational study was carried out, with semi-structured interviews and gynecological check-ups, including sampling for Papanicolaou tests and colposcopy in a consulting room set up in the local nursery school. Informed consent was obtained from all the women before the examinations were performed. The reading of the samples and the specific studies, which included mammograms and ultrasound scans, were carried out in the pathology and radiology services of the Hospital Universitario. The Hospital Universitario field team consisted of gynecologists, social workers, nurses and undergraduate students.

Relevance:

10.00%

Publisher:

Abstract:

Lines 990-1008 of Euripides' Suplicantes (Suppliant Women) constitute a true locus desperatus: there Evadne appears on stage and expresses in lyric verse her feelings prior to her final suicide. Neither the passage's meter, which lacks responsio, nor the evidently corrupt text, nor the inadequate grammar helps in understanding its sense, so much so that most editors have given up trying to grasp the meaning of her words. Nevertheless, we believe that a proper interpretation of the metaphor used by Capaneus' wife aids the understanding of the passage: it is a priamel that becomes a pathetic reminder of her wedding day, voiced by a woman who is about to kill herself on the tomb of her already dead husband. The fire of the lightning bolt that killed Capaneus, like the fire of the pyre on which his corpse now burns (and onto which she herself will very soon throw herself), stirs in her a series of memories of better days, all of them linked to fire and light. The mention of the chariot of the sun and of the moon, an almost fossilized metaphor in Greek literature, allows another metaphor to be developed in a novel way: the girls who, they too, ride upon the darkness carrying their torches. Evadne emphasizes the joy that reigned on her wedding day so as to sharpen the contrast with the present moment.


Relevance:

10.00%

Publisher:

Abstract:

The Climatological Database for the World's Oceans: 1750-1854 (CLIWOC) project, which concluded in 2004, abstracted more than 280,000 daily weather observations from the logbooks of British, Dutch, French, and Spanish naval vessels engaged in imperial business in the eighteenth and nineteenth centuries. These data, now compiled into a database, provide valuable information for the reconstruction of oceanic wind field patterns for this key period, which precedes the time in which anthropogenic influences on climate became evident. These reconstructions, in turn, provide evidence for phenomena such as the El Niño-Southern Oscillation and the North Atlantic Oscillation. Of equal importance is the finding that the CLIWOC database, the first coordinated attempt to harness the scientific potential of this resource, represents less than 10 percent of the volume of data currently known to reside in this important but hitherto neglected source.

Relevance:

10.00%

Publisher:

Abstract:

The endemic Canary Island pine (Pinus canariensis) has an effective strategy to counteract fire disturbance in the short term: a mixed strategy that combines the presence of serotinous cones and thick bark with the ability to re-sprout from the trunk after a fire, a rare trait among pine species. The high frequency of fires in the Canary Islands is related to human action, since natural fires caused by lightning or volcanic activity are very infrequent; hence, how and why serotinous cones are present in this species is still a topic of debate. Previous studies showed that the frequency of serotinous cones varies from stand to stand. Here, we analyzed the presence of serotinous cones at a local scale. We selected a Canary Island pine stand in the transition zone between dry and humid forests in the south of Tenerife. Branches were pruned from 20 trees in order to evaluate the presence of serotinous vs. non-serotinous cones by direct counting on the branch whorls. The opening temperature of serotinous cones was assessed in the laboratory. The percentage of serotinous cones varied from 0 to 93%, showing high variability between trees. Opening temperatures were very high (above 65 ºC) compared with other Mediterranean pine species bearing serotinous cones.
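A toy illustration of the per-tree figure reported above (the percentage of serotinous cones among all cones counted on the pruned branches); the counts below are invented for the example.

```python
# Per-tree serotiny percentage from branch cone counts (invented data).
counts = {  # tree_id: (serotinous_cones, non_serotinous_cones)
    "T01": (28, 2),
    "T02": (0, 35),
    "T03": (11, 9),
}
for tree, (serotinous, non_serotinous) in counts.items():
    pct = 100.0 * serotinous / (serotinous + non_serotinous)
    print(f"{tree}: {pct:.0f} % serotinous")
```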

Relevance:

10.00%

Publisher:

Abstract:

The increasing interest in wireless sensor networks can be promptly understood simply by thinking about what they essentially are: a large number of small, sensing, self-powered nodes that gather information or detect special events and communicate wirelessly, with the end goal of handing their processed data to a base station. The sensor nodes are densely deployed inside the phenomenon of interest, they can be deployed at random, and they have cooperative capabilities. Usually these devices are small and inexpensive, so that they can be produced and deployed in large numbers, and so their resources in terms of energy, memory, computational speed and bandwidth are severely constrained. Sensing, processing and communication are three key elements whose combination in one tiny device gives rise to a vast number of applications. Sensor networks provide endless opportunities, but at the same time pose formidable challenges, such as the fact that energy is a scarce and usually non-renewable resource. However, recent advances in low-power Very Large Scale Integration, embedded computing, communication hardware and, in general, the convergence of computing and communications are making this emerging technology a reality. Likewise, advances in nanotechnology and Micro Electro-Mechanical Systems are pushing toward networks of tiny distributed sensors and actuators. There are different kinds of sensors, such as pressure sensors, accelerometers, cameras, thermal sensors and microphones. They monitor conditions at different locations, such as temperature, humidity, vehicular movement, lighting conditions, pressure, soil makeup, noise levels, the presence or absence of certain kinds of objects, mechanical stress levels on attached objects, and momentary characteristics such as the speed, direction and size of an object. The state of wireless sensor networks will be reviewed and the most famous protocols examined. As Radio Frequency Identification (RFID) is becoming extremely present and important nowadays, it will be examined as well. RFID has a crucial role to play going forward, for businesses and individuals alike. The impact of wireless identification is exerting strong pressure on RFID technology and services, research and development, standards development, security compliance and privacy, and much more. Its economic value has been proven in some countries, while others are just at the planning or pilot stage, but wider usage has yet to take hold through the modernisation of business models and applications. Possible applications of sensor networks are of interest to the most diverse fields: environmental monitoring, warfare, child education, surveillance, micro-surgery and agriculture are only a few examples.

Some real hardware applications in the United States of America will be reviewed, as it is probably the country that has carried out the most research in this area. Universities such as Berkeley, UCLA (University of California, Los Angeles) and Harvard, and companies such as Intel, are leading these investigations. But the USA is not alone in using and investigating wireless sensor networks: the University of Southampton, for example, is developing technology to monitor glacier behaviour using sensor networks, contributing to fundamental research in both glaciology and wireless sensor networks, while Coalesenses GmbH (Germany) and ETH Zurich are also working on applying wireless sensor networks in many different areas. A Spanish solution is the one examined most thoroughly here, for being innovative, adaptable and multipurpose. This study of the sensor focuses mainly on traffic applications, but the compilation of more than 50 different applications published by the sensor's manufacturer should not be overlooked. Currently there are many vehicle-surveillance technologies, including loop sensors, video cameras, image sensors, infrared sensors, microwave radar and GPS. Their performance is acceptable but not sufficient because of their limited coverage and the high costs of implementation and, especially, maintenance. They have shortcomings such as line-of-sight requirements, low accuracy, strong dependence on environment and weather, inability to operate continuously day and night, and high installation and maintenance costs. Consequently, in real traffic applications the received data are insufficient or poor in real-time terms, owing to the limited number of detectors and their cost. With the increase of vehicles in urban road networks, vehicle detection technologies face new requirements. Wireless sensor networks are a state-of-the-art technology and a revolution in remote information sensing and collection, with broad prospects for application in intelligent transportation systems. An application for target tracking and counting using a network of binary sensors has been developed. This allows each device to spend much less energy when transmitting information and to operate more independently, in order to achieve better traffic control. The application focuses on the efficacy of collaborative tracking rather than on the communication protocols used by the sensor nodes. Holiday departure and return traffic is a good case in which it is necessary to keep count of the cars on the roads. To this end, a Matlab simulation has been produced for target tracking and counting using a network of binary sensors, which could, for example, be implemented in the solution from Libelium, the company that developed the sensor examined in depth here. The promising results obtained indicate that binary proximity sensors can form the basis of a robust architecture for wide-area surveillance and tracking. When the target paths are smooth enough, the ClusterTrack particle filter algorithm gives excellent performance in identifying and tracking different target trajectories. This algorithm could, of course, be used for other applications, a possible line of future research. It is not surprising that binary proximity sensor networks have attracted a lot of attention lately: despite the minimal information a binary proximity sensor provides, networks of this sensing modality can track many different target classes accurately enough.
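A much-simplified sketch of the counting-and-localization idea described above, written in Python rather than Matlab. It is not the ClusterTrack particle filter of the thesis: here triggered sensors are simply grouped into spatial clusters and each cluster is reported as one target at the centroid of the triggered positions; the grid layout and sensing radius are invented for the example.

```python
# Counting and coarse localization of targets with binary proximity sensors.
import numpy as np

RADIUS = 12.0  # hypothetical sensing radius (m): a sensor outputs 1 if a target is closer


def detections(sensors: np.ndarray, targets: np.ndarray) -> np.ndarray:
    """Binary output of every sensor given the true target positions."""
    d = np.linalg.norm(sensors[:, None, :] - targets[None, :, :], axis=2)
    return d.min(axis=1) <= RADIUS


def count_and_locate(sensors: np.ndarray, fired: np.ndarray, link: float = 2 * RADIUS):
    """Group triggered sensors into clusters; report one estimated target per cluster."""
    clusters = []
    for p in sensors[fired]:
        for c in clusters:
            if np.linalg.norm(p - np.mean(c, axis=0)) <= link:
                c.append(p)
                break
        else:
            clusters.append([p])
    return [np.mean(c, axis=0) for c in clusters]


# 10 x 10 grid of sensors with 15 m spacing, and two well-separated targets.
xs, ys = np.meshgrid(np.arange(10) * 15.0, np.arange(10) * 15.0)
sensors = np.column_stack([xs.ravel(), ys.ravel()])
targets = np.array([[20.0, 25.0], [100.0, 110.0]])

estimates = count_and_locate(sensors, detections(sensors, targets))
print(f"estimated count: {len(estimates)}")
for e in estimates:
    print(f"  target near ({e[0]:.1f}, {e[1]:.1f})")
```

With well-separated targets this naive clustering already recovers the correct count and approximate positions; tracking smooth trajectories over time is where a filtering algorithm such as ClusterTrack becomes necessary.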

Relevance:

10.00%

Publisher:

Abstract:

The aim of this thesis is the investigation of the new concept of photovoltaic or optoelectronic tweezers, i.e., the trapping, arrangement and manipulation of particles in the structures generated by photovoltaic fields, or their gradients, on the surface of ferroelectric materials. Photovoltaic tweezers are a promising tool for trapping and moving particles on the surface of a photovoltaic material in a controlled way. To take advantage of this new technique it is necessary to know accurately the electric field created by a specific illumination at the crystal surface and above it. For this purpose, the work was divided into the stages described below. The first stage consisted of modeling the photovoltaic field generated by inhomogeneous illumination in substrates and waveguides according to the one-center model. In the second stage, the electrophoretic and dielectrophoretic fields and forces appearing on the surface of inhomogeneously illuminated substrates and waveguides were studied. In the third stage, their effects on microparticles and nanoparticles were studied; in particular, surface trapping was examined, identifying the conditions that allow its use as photovoltaic tweezers. In the fourth and final stage, the most efficient configurations in terms of spatial resolution were studied. Different patterns of inhomogeneous illumination were tested, and illumination patterns were proposed to the experimental team. To achieve these objectives, calculation tools were developed with which all the magnitudes involved in the problem can be obtained as a function of time. With these tools, the complicated trapping mechanisms can be abstracted away, obtaining the trapping directly from a light pattern. All the work was carried out in two configurations of the crystal: X-cut (trapping surface parallel to the optical axis) and Z-cut (trapping surface perpendicular to the optical axis). The differences in the results for the two crystal configurations were studied in depth. All simulations and experiments were performed using the same supporting material, lithium niobate (LiNbO3), in order to facilitate the comparison of results. This does not limit the results, since the models are not restricted to this material.

Regarding the structure of the work, it is divided into three clearly differentiated parts: the Introduction (I), the Modeling of Electrophoretic and Dielectrophoretic Trapping (II), and the Numerical Simulations and Comparison with Experiments (III). The first part sets the foundations on which the rest of the work is based. It describes the electromagnetic and optical effects referred to in the remaining chapters, either because they are needed to describe the experiments or, in other cases, to note that these effects do not appear in the present case and to justify the simplifications of the problem that are made in many instances. This part mainly describes electrophoretic and dielectrophoretic trapping, the photovoltaic effect, and the properties of lithium niobate, the material used in the experiments and simulations. Likewise, as in any research, the state of the art has been analyzed, reviewing what other scientists working in this field have done and written, so that it serves as a foundation for the research. Chapter 3 closes this first part by describing the experimental techniques currently used in laboratories to trap particles by means of the photovoltaic effect, since the results differ slightly depending on the trapping technique in use. Part II, devoted to the modeling of the trapping, begins with chapter 4, where the internal electric field of the sample is modeled, followed by the modeling of the electric field, the potentials and the forces outside the sample. Chapter 5 presents a simple model for understanding the problem at hand, called the Steady-State Charge Separation Model; despite its simplicity, this model gives very good results. In chapter 6 the equations governing the internal physics of the sample are discretized by the finite-difference method, developing the Spatial Charge Distribution Model. To close this part, chapter 8 deals with the programming of the models presented in the previous chapters, in order to provide tools for running the simulations quickly. In the last part, III, the results of the numerical simulations performed with the developed tools are presented and compared with the experimental results; the outcomes in the two crystal configurations, X-cut and Z-cut, can easily be compared. The thesis ends with a final chapter devoted to the conclusions, which summarizes the results obtained in each section and gives an overall view of the research carried out.
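For reference, the textbook expressions behind the two trapping mechanisms named above (standard electrokinetics, not formulas quoted from the thesis): a particle of charge $q$ experiences the electrophoretic force $\mathbf{F}_{\mathrm{EP}} = q\,\mathbf{E}$, while a neutral, polarizable sphere of radius $R$ in a medium of permittivity $\varepsilon_m$ experiences the time-averaged dielectrophoretic force

$$\langle \mathbf{F}_{\mathrm{DEP}} \rangle = 2\pi \varepsilon_m R^{3}\, \mathrm{Re}\!\left[K(\omega)\right]\, \nabla \lvert \mathbf{E} \rvert^{2}, \qquad K(\omega) = \frac{\varepsilon_p^{*} - \varepsilon_m^{*}}{\varepsilon_p^{*} + 2\,\varepsilon_m^{*}},$$

with $\mathbf{E}$ the rms field (or simply the static field in the quasi-DC limit relevant to photovoltaic charge patterns). The sign of the Clausius-Mossotti factor $\mathrm{Re}[K]$ decides whether particles collect at maxima (positive DEP) or minima (negative DEP) of the field intensity, which is why the spatial structure of the photovoltaic field above the illuminated LiNbO3 surface determines the trapping pattern.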

Relevance:

10.00%

Publisher:

Abstract:

This paper deals with a luminous electric discharge that forms in the mesospheric region between thundercloud tops and the ionosphere at 90-km altitude. These cloud-ionosphere discharges (CIs), following visual reports dating back to the 19th century, were finally imaged by a low-light TV camera as part of the “SKYFLASH” program at the University of Minnesota in 1989. Many observations were made by various groups in the period 1993-1996. The characteristics of CIs are as follows: they have a wide range of horizontal sizes, from a few kilometers up to 50 km; they extend from 40 km to nearly 90 km vertically, with an intense region near 60-70 km and streamers extending down toward the cloud tops; and they are partly or entirely composed of vertical luminous filaments of kilometer size. The predominant color is red. The TV images show that CIs usually last less than one TV field (16.7 ms), but higher-speed photometric measurements show that they last about 3 ms and are delayed by 3 ms after an initiating cloud-to-ground lightning stroke; 95% of these initiating strokes are found to be “positive”, i.e., they carry positive charge from cloud to ground. The preference for positive initiating strokes is not understood. Theories of the formation of CIs are briefly reviewed.