920 results for applications in subject areas


Relevance:

100.00%

Publisher:

Abstract:

Thunderstorms are dangerous electrical phenomena in the atmosphere. A thundercloud forms when thermal energy is transported rapidly upwards in convective updraughts. Electrification occurs through collisions of cloud particles in the strong updraught. When the amount of charge in the cloud is large enough, an electrical breakdown, better known as a flash, occurs. Lightning location is nowadays an essential tool for the detection of severe weather. Located flashes indicate in real time the movement of hazardous areas and the intensity of lightning activity. An estimate of the flash peak current can also be determined, and the observations can be used in damage surveys. The simplest way to represent lightning data is to plot the locations on a map, but the data can also be processed into more complex end-products and exploited in data fusion. Lightning data also serve as an important tool in the research of lightning-related phenomena, such as Transient Luminous Events. Most of the world's thunderstorms occur in areas with plenty of heat, moisture and tropospheric instability, for example in tropical land areas. At higher latitudes, as in Finland, the thunderstorm season is practically restricted to the summer. A particular feature of the high-latitude climatology is the large annual variation, which also applies to thunderstorms. Knowing the performance of any measuring device is important because it affects the accuracy of the end-products. In lightning location systems, the detection efficiency is the ratio between located flashes and flashes that actually occurred. Because in practice it is impossible to know the true number of flashes that occurred, the detection efficiency has to be estimated with theoretical methods.
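Since the true number of flashes is unknown, a detection-efficiency estimate must come from a model. A minimal sketch, assuming (hypothetically) lognormally distributed flash peak currents and a sensor that registers any flash whose peak current exceeds a threshold; all names and parameter values here are invented for illustration, not taken from the thesis:

```python
import math

def detection_efficiency(threshold_ka, median_ka=30.0, sigma=0.7):
    """Probability that a flash's peak current exceeds the sensor
    threshold, for peak currents assumed LogNormal(log(median), sigma).
    The ~30 kA median and the sigma are illustrative values only."""
    z = (math.log(threshold_ka) - math.log(median_ka)) / sigma
    # Survival function of the standard normal via the complementary
    # error function: P(Z > z) = erfc(z / sqrt(2)) / 2
    return 0.5 * math.erfc(z / math.sqrt(2))
```

Under these assumptions, a threshold equal to the assumed median current yields a detection efficiency of exactly 0.5, and lower thresholds yield higher efficiency.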

Increased boating activity and new waterfront developments have contributed an estimated 3,000 dismantled, abandoned, junked, wrecked, or derelict vessels to Florida coastal waters. This report outlines a method of siting and prioritizing derelict vessel removal, using the Florida Keys as a test area. The database consisted of information on 240 vessels obtained from Florida Marine Patrol files. Vessel locations were plotted on 1:250,000 regional and 1:5,000 and 1:12,000 site maps. Type of vessel, length, hull material, engine, fuel tanks, overall condition, afloat and submerged characteristics, and accessibility were used to derive parametric site indices of removal priority and removal difficulty. Results indicate 59 top-priority cases which should be the focus of immediate clean-up efforts in the Florida Keys. Half of these cases are rated low to moderate in removal difficulty; the remainder are difficult to remove. Removal difficulty is a surrogate for removal cost: low difficulty means low cost, high difficulty high cost. The rating scheme offers coastal planners the option of focusing removal operations either on (1) specific areas with clusters of high-priority derelict vessels or on (2) selected target derelicts at various specific locations. (PDF has 59 pages.)
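The parametric-index idea can be illustrated with a toy weighted score; the attribute names and weights below are hypothetical stand-ins, not the report's actual scheme:

```python
def removal_priority(vessel):
    """Toy parametric site index: a weighted sum of integer attribute
    ratings (0 = none/low), where a higher score means higher removal
    priority.  Attribute names and weights are illustrative only."""
    weights = {"condition": 3, "navigation_hazard": 5,
               "pollution_risk": 4, "visibility": 2}
    return sum(w * vessel.get(attr, 0) for attr, w in weights.items())
```

Ranking all 240 vessels by such a score and cutting at a threshold would produce a top-priority list like the 59 cases described above; an analogous index over accessibility and hull attributes would give the removal-difficulty rating.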

Adaptive cluster sampling (ACS) has been the subject of many publications on sampling aggregated populations, but choosing the criterion value that invokes ACS remains problematic. We address this problem using data from a June 1999 ACS survey for rockfish, specifically for Pacific ocean perch (Sebastes alutus), and for shortraker (S. borealis) and rougheye (S. aleutianus) rockfish combined. Our hypotheses were that ACS would outperform simple random sampling (SRS) for S. alutus and would be more applicable for S. alutus than for S. borealis and S. aleutianus combined, because populations of S. alutus are thought to be more aggregated. Three alternatives for choosing a criterion value were investigated. We chose the strategy that yielded the lowest criterion value and simulated the higher criterion values with the data after the survey. Systematic random sampling was conducted across the whole area to determine the lowest criterion value, and then a new systematic random sample was taken with adaptive sampling around each tow that exceeded the fixed criterion value. ACS yielded gains in precision (SE) over SRS. Bootstrapping showed that the distribution of the ACS estimator is approximately normal, whereas the SRS sampling distribution is skewed and bimodal. Simulation showed that a higher criterion value results in substantially less adaptive sampling with little trade-off in precision. When time efficiency was examined, ACS quickly added more samples, but sampling edge units lessened this efficiency; the gain in efficiency did not measurably affect our conclusions. ACS for S. alutus should be implemented with a fixed criterion value equal to the top quartile of previously collected survey data. The second hypothesis was confirmed, because ACS did not prove to be more effective for S. borealis-S. aleutianus. Overall, our ACS results were not as optimistic as those previously published in the literature and indicate the need for further study of this sampling method.
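The recommended fixed criterion, the top quartile of previously collected survey data, is straightforward to compute; this sketch uses a standard linear-interpolation percentile and is not the paper's own code:

```python
def top_quartile_criterion(catch_rates):
    """75th percentile (top-quartile boundary) of historical catch
    rates, computed with linear interpolation between order statistics;
    tows whose catch rate exceeds this value would trigger adaptive
    sampling of neighbouring units."""
    s = sorted(catch_rates)
    k = 0.75 * (len(s) - 1)        # fractional index of the percentile
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (k - f) * (s[c] - s[f])
```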

R. Zwiggelaar, C.R. Bull, and M.J. Mooney, 'X-ray simulations for imaging applications in the agricultural and food industry', Journal of Agricultural Engineering Research 63(2), 161-170 (1996)

The objective of this paper is to demonstrate an approach to characterizing the spatial variability of ambient air concentrations using mobile-platform measurements. This approach may be useful for air toxics assessments in environmental justice applications, epidemiological studies, and environmental health risk assessments. In this study, we developed and applied a method to characterize air toxics concentrations in urban areas using the results of a recently conducted field study in Wilmington, DE. Mobile measurements were collected over a 4 × 4 km area of downtown Wilmington for three components: formaldehyde (representative of volatile organic compounds and also of photochemically reactive pollutants), aerosol size distribution (representing fine particulate matter), and water-soluble hexavalent chromium (representative of toxic metals). These measurements were used to construct spatial and temporal distributions of air toxics in the area, which show very strong temporal variability, both diurnal and seasonal. An analysis of spatial variability indicates that all pollutants varied significantly by location, which suggests a potential impact of local sources. From the comparison with measurements at the central monitoring site, we conclude that formaldehyde and fine particulates show a positive correlation with temperature; this could also explain why photochemically generated formaldehyde and fine particulates over the study area correlate well with the fine particulate matter measured at the central site.

Au nanoparticles (AuNPs) have attracted great interest in the fabrication of various biosensor systems for the analysis of cellular and biomolecular recognition. In conjunction with the vast conjugation chemistry available, these materials are easily coupled with biomolecules such as nucleic acids, antigens or antibodies to realise their many potential applications as ligand carriers or transducing platforms for preparation, detection and quantification purposes. Furthermore, the nanoparticles possess easily tuned and unique optical, physical and chemical characteristics and high surface areas, making them ideal candidates for this purpose. Here, sensing mechanisms based on localized surface plasmon resonance (LSPR), particle aggregation, catalytic properties, and fluorescence resonance energy transfer (FRET) of AuNPs, as well as barcoding technologies including DNA biobarcodes, are discussed.

This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have been the subject of intensive research lately, as they open the way to an essentially easier use of the methodology; the lack of user-friendly computer programs has been a major obstacle to wider acceptance of the methods. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method has been built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension of DRAM for model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed with the needs of modelling applications typical of the environmental sciences in mind, and the development work was pursued in the course of several application projects. The applications presented in this work are: a wintertime oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; and validation of the algorithms of the GOMOS ozone remote sensing instrument on board the Envisat satellite of the European Space Agency, together with a study of the effects of aerosol model selection on the GOMOS algorithm.
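DRAM combines delayed rejection with adaptive Metropolis; the adaptive ingredient alone can be sketched in one dimension as a random-walk Metropolis sampler whose proposal scale is tuned toward a target acceptance rate. This is a greatly simplified illustration, not the DRAM algorithm itself:

```python
import math
import random

def adaptive_metropolis(log_target, x0, n_iter=5000, scale=1.0,
                        adapt_every=100, target_rate=0.44):
    """1-D random-walk Metropolis with a crude scale adaptation: every
    `adapt_every` iterations the proposal standard deviation is nudged
    so the cumulative acceptance rate approaches `target_rate`."""
    x, lp = x0, log_target(x0)
    accepted, chain = 0, []
    for i in range(1, n_iter + 1):
        proposal = x + random.gauss(0.0, scale)
        lp_prop = log_target(proposal)
        # Metropolis accept/reject on the log scale
        if math.log(random.random()) < lp_prop - lp:
            x, lp = proposal, lp_prop
            accepted += 1
        chain.append(x)
        if i % adapt_every == 0:
            scale *= math.exp(accepted / i - target_rate)
    return chain
```

Run on a standard normal log-density, the chain's sample mean and variance should settle near 0 and 1. Full DRAM additionally tries a second, smaller proposal after a rejection (delayed rejection) and adapts the whole proposal covariance from the chain history.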

Location decisions are often subject to dynamic aspects such as changes in customer demand. Responding to these requires increased flexibility in the location and capacity of facilities. Even when demand can be forecast, finding the optimal schedule for deploying and dynamically adjusting capacities remains a challenge. In this thesis, we focus on multi-period location problems that allow dynamic capacity adjustment, in particular those with complex cost structures. We study these problems from different operations research perspectives, presenting and comparing several mixed-integer linear programming (MILP) models, evaluating their use in practice, and developing efficient solution algorithms. The thesis is divided into four parts. First, we present the industrial context that motivated our work: a forestry company that needs to locate camps to house forest workers. We present a MILP model that allows the construction of new camps and the expansion, relocation and temporary partial closure of existing camps. The model uses particular capacity constraints as well as a multi-level economies-of-scale cost structure. The usefulness of the model is assessed through two case studies. The second part introduces the dynamic facility location problem with generalized modular capacities. The model generalizes several dynamic location problems and provides tighter linear relaxation bounds than their specialized formulations. It can solve location problems in which the costs of capacity changes are defined for all pairs of capacity levels, as is the case in the industrial problem mentioned above. It is applied to three special cases: capacity expansion and reduction, temporary facility closure, and the combination of the two. We prove dominance relationships between our formulation and existing models for the special cases. Computational experiments on a large number of randomly generated instances with up to 100 facilities and 1,000 customers show that our model can obtain optimal solutions faster than the existing specialized formulations. Given the complexity of the preceding models for large instances, the third part of the thesis proposes Lagrangian heuristics. Based on subgradient and bundle methods, they find good-quality solutions even for large instances with up to 250 facilities and 1,000 customers. We then improve the quality of the obtained solution by solving a restricted MILP model that exploits information collected while solving the Lagrangian dual. Computational results show that the heuristics quickly produce good-quality solutions, even for instances on which generic solvers fail to find feasible solutions. Finally, we adapt the preceding heuristics to solve the industrial problem. Two different relaxations are proposed and compared. Extensions of the preceding concepts are presented to ensure a reliable solution within reasonable time.
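The "costs defined for all pairs of capacity levels" structure can be made concrete with a toy single-facility, multi-period model solved by brute force. This is purely illustrative (real instances need the MILP and Lagrangian machinery described above), and all names and numbers are invented:

```python
from itertools import product

def best_capacity_plan(demands, levels, hold_cost, change_cost):
    """Brute-force a toy single-facility, multi-period capacity plan:
    pick one capacity level per period that covers that period's demand,
    paying `hold_cost` per unit of installed capacity per period plus
    `change_cost[a][b]` for every transition between consecutive levels
    (a cost defined for every pair of capacity levels)."""
    best = (float("inf"), None)
    for plan in product(levels, repeat=len(demands)):
        if any(c < d for c, d in zip(plan, demands)):
            continue  # infeasible: capacity must cover demand
        cost = sum(hold_cost * c for c in plan)
        cost += sum(change_cost[a][b] for a, b in zip(plan, plan[1:]))
        if cost < best[0]:
            best = (cost, plan)
    return best
```

Enumerating level sequences is exponential in the number of periods, which is exactly why the thesis develops tighter MILP formulations and Lagrangian heuristics instead.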

The discovery of the photoacoustic (PA) effect was a remarkable achievement, yet it was relegated to the scientific footnotes of the nineteenth century. After the advent of lasers and sophisticated electronics, however, the effect was rediscovered, and it has established itself as an important research and analytical tool in numerous areas, including physics, chemistry, biology and medicine. Quite recently, this phenomenon has made its impact in the field of laser technology, with applications such as the development of highly efficient active media for lasers, high-quality optics and sensitive laser power monitoring devices. This thesis presents the work carried out by the author in this field during the past few years at the Department of Physics of Cochin University of Science and Technology. The studies discussed here are mostly based on the development of a sensitive PA laser power meter and its various applications using the different laser systems available in the laboratory. This includes the development of a current-regulated CW CO laser and its application in material processing. The thesis contains seven chapters which are by and large self-contained, with separate abstracts and references. The first chapter, which is divided into two parts, presents an introduction to the PA effect and its present status. Part A reviews the basic theory of lasers and gives a summary of various lasers and their applications. Part B presents a brief description of the PA effect and its suitability as a spectroscopic tool, followed by its applications to various branches of science and technology.

This book is a collection of articles devoted to the theory of linear operators in Hilbert spaces and its applications. The subjects covered range from the abstract theory of Toeplitz operators to the analysis of very specific differential operators arising in quantum mechanics, electromagnetism, and the theory of elasticity; the stability of numerical methods is also discussed. Many of the articles deal with spectral problems for not necessarily self-adjoint operators. Some of the articles are surveys outlining the current state of the subject and presenting open problems.

The objective of this book is to present the quantitative techniques commonly employed in empirical finance research, together with real-world, state-of-the-art research examples. Each chapter is written by international experts in their field. The unique approach is to describe a question or issue in finance and then to demonstrate the methodologies that may be used to solve it. All of the techniques described are used to address real problems rather than being presented for their own sake, and the areas of application have been carefully selected so that a broad range of methodological approaches can be covered. The book is aimed primarily at doctoral researchers and academics engaged in conducting original empirical research in finance. In addition, it will be useful to researchers in the financial markets and to advanced Master's-level students who are writing dissertations.

The solar and longwave environmental irradiance geometry (SOLWEIG) model simulates spatial variations of 3-D radiation fluxes and mean radiant temperature (Tmrt), as well as shadow patterns, in complex urban settings. In this paper, a new vegetation scheme is included in SOLWEIG and evaluated. The new shadow-casting algorithm for complex vegetation structures makes it possible to obtain continuous images of shadow patterns and sky view factors that take both buildings and vegetation into account. For the calculation of 3-D radiation fluxes and Tmrt, SOLWEIG requires only a limited number of inputs, such as global shortwave radiation, air temperature, relative humidity, geographical information (latitude, longitude and elevation) and urban geometry represented by high-resolution ground and building digital elevation models (DEMs). Trees and bushes are represented by separate DEMs. The model is evaluated using five days of integral radiation measurements at two sites within a square surrounded by low-rise buildings and vegetation in Göteborg, Sweden (57°N). There is good agreement between modelled and observed values of Tmrt, with an overall correspondence of R² = 0.91 (p < 0.01, RMSE = 3.1 K). A small overestimation of Tmrt is found at locations shadowed by vegetation. Given this good performance, a number of suggestions for future development are identified, for applications including human comfort, building design, planning, and evaluation of instrument exposure.
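The agreement statistics quoted above (R² and RMSE) are straightforward to compute from paired modelled and observed Tmrt series; a minimal sketch, with R² taken as the squared Pearson correlation:

```python
def rmse_and_r2(observed, modelled):
    """RMSE (in the same unit as the data, e.g. kelvin for Tmrt) and
    R^2 as the squared Pearson correlation of the paired series.
    Assumes equal-length, non-constant series (otherwise R^2 is
    undefined)."""
    n = len(observed)
    rmse = (sum((o - m) ** 2 for o, m in zip(observed, modelled)) / n) ** 0.5
    mean_o = sum(observed) / n
    mean_m = sum(modelled) / n
    cov = sum((o - mean_o) * (m - mean_m)
              for o, m in zip(observed, modelled))
    var_o = sum((o - mean_o) ** 2 for o in observed)
    var_m = sum((m - mean_m) ** 2 for m in modelled)
    return rmse, cov * cov / (var_o * var_m)
```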

This paper presents a novel mobile-sink area allocation scheme for consumer mobile robotic devices, with a proven application to robotic vacuum cleaners. In a home or office environment, rooms are physically separated by walls, and an automated robotic cleaner cannot decide which room to move to next to perform its cleaning task; likewise, state-of-the-art cleaning robots do not move to other rooms without direct human intervention. In a smart home monitoring system, sensor nodes may be deployed to monitor each separate room. In this work, a quad-tree-based data gathering scheme is proposed whereby the mobile sink physically moves through every room and logically links all the separated sub-networks together. The proposed scheme sequentially collects data from the monitored environment and transmits the information back to a base station. Based on the sensor nodes' information, the base station can command a cleaning robot to move to a specific location in the home environment. The quad-tree-based data gathering scheme minimizes the data gathering tour length and time through the efficient allocation of data gathering areas. A calculated shortest-path data gathering tour can be allocated efficiently to the robotic cleaner so that it completes the cleaning task within a minimum time period. Simulation results show that the proposed scheme can effectively allocate and control the cleaning area for the robot vacuum cleaner without any direct interference from the consumer. The performance of the proposed scheme is then validated with a set of practical sequential data gathering tours in a typical office/home environment.
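The quad-tree allocation can be sketched as a recursive split of the monitored area into quadrants until each cell holds few enough sensor nodes; the resulting leaf cells are the stops of the sink's data gathering tour. A toy version under those assumptions, not the paper's algorithm:

```python
def quadtree_cells(points, x, y, size, max_pts=1, depth=4):
    """Recursively split the square region (x, y, size) into four
    quadrants until each cell holds at most `max_pts` sensor nodes (or
    the maximum depth is reached).  Returns the non-empty leaf cells as
    (x, y, size) tuples - the data gathering areas a mobile sink would
    visit."""
    inside = [p for p in points
              if x <= p[0] < x + size and y <= p[1] < y + size]
    if len(inside) <= max_pts or depth == 0:
        return [(x, y, size)] if inside else []
    half = size / 2
    cells = []
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        cells += quadtree_cells(inside, x + dx, y + dy, half,
                                max_pts, depth - 1)
    return cells
```

Ordering the returned cells into a short tour (e.g. by nearest neighbour) would give the sequential data gathering tour described above.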

Ranking is an important task for handling large amounts of content. Ideally, training data for supervised ranking would include a complete ranking of the documents (or other objects such as images or videos) for a particular query. However, this is only possible for small sets of documents. In practice, one often resorts to document rating, in which a subset of documents is assigned a small number indicating its degree of relevance. This poses the general problem of modelling and learning rank data with ties. In this paper, we propose a probabilistic generative model that treats the process as permutations over partitions. This results in a super-exponential combinatorial state space with an unknown number of partitions and an unknown ordering among them. We approach the problem through discrete choice theory, where subsets are chosen in a stagewise manner, significantly reducing the state space at each stage. Further, we show that with suitable parameterisation we can still learn the models in linear time. We evaluate the proposed models on two application areas: (i) document ranking with data from the recently held Yahoo! challenge, and (ii) collaborative filtering with movie data. The results demonstrate that the models are competitive against well-known rivals.
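The stagewise-choice idea is easiest to see in the tie-free special case, the classical Plackett-Luce model, where each rank position is filled by choosing among the remaining items with probability proportional to a positive score; the paper generalises this to choosing whole subsets (partitions) at each stage. A sketch of the tie-free case:

```python
import math

def plackett_luce_logprob(ranking, scores):
    """Log-probability of a full ranking under the Plackett-Luce
    model: items are chosen stage by stage, each with probability
    proportional to its score among the items not yet ranked."""
    lp = 0.0
    remaining = list(ranking)
    for item in ranking:
        denom = sum(scores[r] for r in remaining)
        lp += math.log(scores[item] / denom)  # choice prob. at this stage
        remaining.remove(item)
    return lp
```

With equal scores every ranking of n items gets probability 1/n!, recovering the uniform distribution over permutations.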