916 results for: Wavelets and fast transform wavelet
Abstract:
The Finite Difference Time Domain (FDTD) method and associated software are applied to obtain the waves diffracted by right-angle wedges under modulated Gaussian plane-wave illumination, and the Fast Fourier Transform (FFT) is used to extract wideband diffraction coefficients in the illuminated (lit) region. Theta and phi polarizations in the 3-dimensional case, and TM and TE polarizations in the 2-dimensional case, are considered for the soft and hard diffraction coefficients, respectively. FDTD results for a perfect electric conductor (PEC) wedge are compared with asymptotic expressions from the Uniform Theory of Diffraction (UTD). The analysis is then extended from PEC wedges to homogeneous conducting and dielectric building materials, for which diffraction coefficients are not available analytically under practical conditions.
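The FFT step lends itself to a compact illustration. The following is a minimal sketch, not the paper's code: it estimates a wideband diffraction coefficient by deconvolving recorded incident and diffracted time signals, with an assumed 2-D cylindrical spreading correction and an assumed noise threshold.

```python
import numpy as np

def diffraction_coefficient(e_inc, e_diff, dt, rho, c=3e8):
    """Wideband diffraction coefficient from time-domain FDTD probe recordings.

    e_inc, e_diff: incident and diffracted field samples at the observation point;
    rho: distance from the edge [m]. The spreading correction assumes 2-D geometry.
    """
    freqs = np.fft.rfftfreq(len(e_inc), dt)
    E_inc, E_diff = np.fft.rfft(e_inc), np.fft.rfft(e_diff)
    k = 2 * np.pi * freqs / c                        # free-space wavenumber
    spreading = np.sqrt(rho) * np.exp(1j * k * rho)  # undo exp(-jk*rho)/sqrt(rho) decay
    # Only divide where the modulated Gaussian pulse carries energy; elsewhere the
    # deconvolution would just amplify numerical noise.
    band = np.abs(E_inc) > 1e-3 * np.abs(E_inc).max()
    D = np.full(E_inc.shape, np.nan, dtype=complex)
    D[band] = E_diff[band] / E_inc[band] * spreading[band]
    return freqs, D
```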
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are costly and carry inherent calibration and noise effects, or to software techniques that filter the binning effect but fail to preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is appealing enough that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
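The patented mapping itself is not disclosed in the abstract, so the following is only a minimal sketch of the lookup-table idea under assumed channel counts and decade settings: each linear-channel event is spread fractionally between the two neighbouring log-scale output bins, using tables computed once.

```python
import numpy as np

# Illustrative assumptions, not the dissertation's actual parameters:
N_LINEAR = 1024          # linear ADC channels
N_LOG = 256              # output log-scale bins
DECADES = 4.0            # dynamic range covered by the log axis

# Lookup tables, computed once: fractional log-bin position for every channel.
chan = np.arange(1, N_LINEAR + 1)
pos = np.log10(chan / chan[-1]) / DECADES * N_LOG + N_LOG   # float position in [0, N_LOG]
lo = np.clip(np.floor(pos).astype(int), 0, N_LOG - 1)       # lower neighbouring bin
w_hi = pos - lo                                             # weight for the upper bin

def accumulate(events, hist=None):
    """Add integer linear-channel events (1..N_LINEAR) into a fractional log histogram."""
    if hist is None:
        hist = np.zeros(N_LOG + 1)
    idx = events - 1                       # channels are 1-based above
    np.add.at(hist, lo[idx], 1.0 - w_hi[idx])   # fraction to the lower bin
    np.add.at(hist, lo[idx] + 1, w_hi[idx])     # remainder to the upper bin
    return hist
```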
Abstract:
This dissertation develops an innovative approach towards less-constrained iris biometrics. Two major contributions are made in this research endeavor: (1) an award-winning segmentation algorithm for the less-constrained environment, where images are acquired from subjects on the move under visible lighting conditions, and (2) a pioneering iris biometrics method coupling segmentation and recognition of the iris based on video of moving persons under different acquisition scenarios. The first part of the dissertation introduces a robust and fast segmentation approach using still images contained in the UBIRIS (version 2) noisy iris database. The results show accuracy estimated at 98% when using 500 randomly selected images from the UBIRIS.v2 partial database, and at 97% in the Noisy Iris Challenge Evaluation (NICE.I), an international competition that involved 97 participants from 35 countries, ranking this research group in sixth position. This accuracy is achieved at a processing speed nearing real time. The second part of this dissertation presents an innovative segmentation and recognition approach using video-based iris images. Following the segmentation stage, which delineates the iris region through a novel segmentation strategy, pioneering experiments on the recognition stage of less-constrained video iris biometrics were accomplished. In video-based, less-constrained iris recognition, the test (subject) iris videos/images and the enrolled iris images are acquired with different acquisition systems. In the matching step, verification/identification was accomplished by comparing the similarity distance between the encoded signature from the test images and each signature in the dataset from the enrolled iris images. With the improvements gained, the results proved highly accurate in the more challenging unconstrained environment. This led to a false acceptance rate (FAR) of 0% and a false rejection rate (FRR) of 17.64% for 85 tested users with 305 test images from video, which shows great promise and high practical implications for iris biometrics research and system design.
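The encoding and distance measure are not specified in this summary; a common choice in iris biometrics is a binary iris code compared by normalized Hamming distance. Below is a minimal sketch with a hypothetical acceptance threshold, where masks flag occluded bits (eyelids, reflections).

```python
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Normalized Hamming distance over the bits valid in both codes."""
    valid = mask_a & mask_b
    disagreements = (code_a ^ code_b) & valid
    return disagreements.sum() / max(valid.sum(), 1)

def identify(test_code, test_mask, enrolled, threshold=0.32):
    """Return the enrolled identity with the smallest distance, or None (rejection).

    enrolled: dict mapping identity -> (code, mask), both boolean arrays.
    """
    best_id, best_d = None, 1.0
    for identity, (code, mask) in enrolled.items():
        d = hamming_distance(test_code, code, test_mask, mask)
        if d < best_d:
            best_id, best_d = identity, d
    return (best_id, best_d) if best_d <= threshold else (None, best_d)
```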
Abstract:
In their discussion - Fast-Food Franchises: An Alternative Menu for Hotel/Casinos - by Skip Swerdlow, Assistant Professor of Finance, Larry Strate, Assistant Professor of Business Law, and Francis X. Brown, Assistant Professor of Hotel Administration at the University of Nevada, Las Vegas, their preview reads: "Hotel/casino food service operations are adding some non-traditional fare to their daily offerings in the form of fast-food franchises. The authors review aspects of franchising and cite some new Las Vegas food ideas." The authors note that the statewide food and beverage figures, according to the Nevada Gaming Abstract of 1985, exceeded $1.24 billion. Most of that figure was generated in traditional coffee shops, gourmet dining rooms, and buffets. With that kind of food and beverage figure solidly on the table, it was inevitable that fast-food franchises would move into casinos to garner a share of the proceeds. In a March 1986 review of franchising, Restaurant Business reported the following statistics: "Over 60 percent of all restaurants are franchisee owned. This relationship is also paralleled in dollar sales, which has exceeded $53 billion." "Restaurant franchising expansion has grown at an annual rate of 12 percent per year for the past five years." The beginning of the article is dedicated to describing, in general, the franchise phenomenon; growth has been spectacular, the authors inform you. "The franchise concept has provided an easy method of going into business for the entrepreneur with minimal business experience, but a desire to work hard to make a profit," say professors Swerdlow, Strate, and Brown. Lured by tourist traffic at the floundering, Chapter 11-afflicted Riviera Hotel and Casino in Las Vegas, Burger King saw an attractive opportunity for an experiment in non-traditional outlet placement, say the authors. Although innately transient, the tourist numbers were too significant to ignore. That tourist traffic, the authors say, is round-the-clock. Added to that figure is the 2,000-3,000 average employee count for many of the casinos on the Las Vegas Strip. Not surprisingly, the project began to look very appealing to both Burger King and the Riviera Hotel/Casino, the authors report. In the final analysis, the project worked out very well indeed. As they write, "The successful operation of the Burger King in the Riviera has sparked interest by other existing hotel/casino operations and fast-food restaurant chains. Burger King's operation, like so many other industry leadership decisions, provides impetus for healthy competition in a market that is burgeoning not only because of expansion that recognizes traditional population growth, but because of bold moves that search for customers in non-traditional areas." The authors provide an Appendix listing Las Vegas hotel/casino properties and the restaurants they contain.
Abstract:
The aim of this study was to develop a practical, versatile, and fast dosimetry and radiobiological model for calculating the 3D dose distribution and radiobiological effectiveness of radioactive stents. The algorithm was written in the Matlab 6.5 programming language and is based on dose point kernel convolution. The dosimetry and radiobiological model was applied to evaluate the 3D dose distribution of 32P, 90Y, 188Re, and 177Lu stents. Of the four, 32P delivers the highest dose, while 90Y, 188Re, and 177Lu require high levels of activity to deliver a significant therapeutic dose in the range of 15-30 Gy. Results of the radiobiological model demonstrated that the same physical dose delivered by different radioisotopes produces significantly different radiobiological effects. This type of theoretical dose calculation can be useful in the development of new stent designs, the planning of animal studies and clinical trials, and clinical decisions involving individualized treatment plans.
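As an illustration of dose point kernel convolution, the approach the model is based on, here is a brief sketch; the original is in Matlab, this sketch uses Python, and the kernel is a placeholder rather than tabulated isotope data.

```python
import numpy as np
from scipy.ndimage import convolve

def dose_distribution(activity_map, dpk_radial, voxel_mm, kernel_radius_vox=8):
    """Convolve a 3-D cumulated-activity map with a radial dose point kernel."""
    r = np.arange(-kernel_radius_vox, kernel_radius_vox + 1) * voxel_mm
    X, Y, Z = np.meshgrid(r, r, r, indexing="ij")
    kernel = dpk_radial(np.sqrt(X**2 + Y**2 + Z**2))  # dose per decay vs. distance
    return convolve(activity_map, kernel, mode="constant")

# Illustrative use: a fake exponential kernel and a crude stent-like source region.
fake_dpk = lambda r_mm: np.exp(-r_mm / 2.0) / (1.0 + r_mm) ** 2
activity = np.zeros((48, 48, 48))
activity[22:26, 22:26, 12:36] = 1e6               # arbitrary activity placement
dose = dose_distribution(activity, fake_dpk, voxel_mm=0.1)
```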
Abstract:
We investigated controls on the water chemistry of a South Ecuadorian cloud forest catchment which is partly pristine and partly converted to extensive pasture. From April 2007 to May 2008, water samples were taken weekly to biweekly at nine different subcatchments and screened for differences in electrical conductivity, pH, and anion and element composition. A principal component analysis was conducted to reduce the dimensionality of the data set and define the major factors explaining variation in the data. Three main factors were isolated by a subset of 10 elements (Ca2+, Ce, Gd, K+, Mg2+, Na+, Nd, Rb, Sr, Y), explaining around 90% of the data variation. Land use was the major factor controlling and changing the water chemistry of the subcatchments. A second factor was associated with the concentration of rare earth elements in water, presumably highlighting other anthropogenic influences such as gravel excavation or road construction. Around 12% of the variation was explained by the third component, which was defined by the occurrence of Rb and K and represents the influence of vegetation dynamics on element accumulation and wash-out. Comparison of base-flow and fast-flow concentrations suggests that a significant portion of soil water from around 30 cm depth contributes to storm flow, as revealed by increased rare earth element concentrations in fast-flow samples. Our findings demonstrate the utility of multi-tracer principal component analysis for studying tropical headwater streams, and emphasize the need for effective land management in cloud forest catchments.
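For readers unfamiliar with the technique, a minimal sketch of the PCA step on standardized element concentrations follows; the data matrix is a placeholder, and only the list of 10 retained elements comes from the study.

```python
import numpy as np

elements = ["Ca", "Ce", "Gd", "K", "Mg", "Na", "Nd", "Rb", "Sr", "Y"]

def pca(X, n_components=3):
    """Return component loadings and the share of variance each component explains."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize each element
    cov = np.cov(Xs, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]                   # largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals / eigvals.sum()
    return eigvecs[:, :n_components], explained[:n_components]

# e.g. X has one row per water sample, one column per element (placeholder data):
X = np.random.lognormal(size=(60, len(elements)))
loadings, explained = pca(X)
print(dict(zip(["PC1", "PC2", "PC3"], np.round(explained, 2))))
```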
Abstract:
This thesis reports on a novel method to build a 3-D model of the above-water portion of icebergs using surface imaging. The goal is to work towards the automation of iceberg surveys, allowing an Autonomous Surface Craft (ASC) to acquire shape and size information. After data and image collection, the core software algorithm comprises three parts: occluding contour finding, volume intersection, and parameter estimation. A software module is designed that could be used on the ASC to perform automatic, fast processing of above-water surface image data and determine iceberg shape and size. The resolution of the method is calculated using data from the iceberg database of the Program of Energy Research and Development (PERD). The method was investigated using data from field trials conducted through the summer of 2014, in which 8 icebergs were surveyed during 3 expeditions. The results were analyzed to determine iceberg characteristics. Limitations of the method, including its accuracy, are addressed. A surface imaging system and a LIDAR system were developed in 2015 to profile the above-water portion of icebergs.
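Volume intersection (shape-from-silhouette carving) is the central geometric step; here is a minimal sketch under an assumed pinhole projection interface, not the thesis' implementation.

```python
import numpy as np

def carve(voxels_xyz, silhouettes, project):
    """Keep only voxels whose projections fall inside every occluding contour.

    voxels_xyz: (N, 3) candidate points; silhouettes: list of binary masks;
    project(points, i) -> (N, 2) pixel coordinates in image i (NaN if not visible).
    """
    keep = np.ones(len(voxels_xyz), dtype=bool)
    for i, sil in enumerate(silhouettes):
        uv = project(voxels_xyz, i)
        inside = np.zeros(len(voxels_xyz), dtype=bool)
        valid = np.isfinite(uv).all(axis=1)
        u = np.clip(uv[valid, 0].astype(int), 0, sil.shape[1] - 1)
        v = np.clip(uv[valid, 1].astype(int), 0, sil.shape[0] - 1)
        inside[valid] = sil[v, u]
        keep &= inside            # a voxel survives only if seen in every view
    return voxels_xyz[keep]

# The above-water volume estimate is the surviving voxel count times the voxel volume.
```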
Abstract:
Coral reefs are increasingly threatened by global and local anthropogenic stressors such as rising seawater temperature and nutrient enrichment. These two stressors vary widely across the reef face, and parsing out their influence on coral communities at reef system scales has been particularly challenging. Here, we investigate the influence of temperature and nutrients on coral community traits and life history strategies on lagoonal reefs across the Belize Mesoamerican Barrier Reef System (MBRS). A novel metric was developed using ultra-high-resolution sea surface temperatures (SST) to classify reefs as enduring low (lowTP), moderate (modTP), or extreme (extTP) temperature regimes over 10 years (2003 to 2012). Chlorophyll-a (chl a) records obtained for the same interval were employed as a proxy for bulk nutrients, and these records were complemented with in situ measurements to "sea truth" nutrient content across the three reef types. Chl a concentrations were highest at extTP sites, intermediate at modTP sites, and lowest at lowTP sites. Coral species richness, abundance, diversity, density, and percent cover were lower at extTP sites than at lowTP and modTP sites, but these community traits did not differ between lowTP and modTP sites. Coral life history strategy analyses showed that extTP sites were dominated by hardy stress-tolerant and fast-growing weedy coral species, while lowTP and modTP sites hosted competitive, generalist, weedy, and stress-tolerant coral species. These results suggest that the differences in coral community traits and life history strategies between extTP and lowTP/modTP sites were driven primarily by temperature, with nutrient differences across site types playing a lesser role. The dominance of weedy and stress-tolerant genera at extTP sites suggests that corals utilizing these two life history strategies may be better suited to cope with warmer oceans and thus may warrant further protective status during this interval of climate change. A sketch of one plausible form of the SST classification appears after the data list below.
Data associated with this project are archived here, including:
-SST data
-Satellite Chl a data
-Nutrient measurements
-Raw coral community survey data
For questions contact Justin Baumann (j.baumann3
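As referenced above, here is a purely hypothetical sketch of how daily SST records might be split into the three site classes; the study's actual metric is more involved, and the thresholds and statistics below are placeholders chosen only to illustrate the lowTP/modTP/extTP split.

```python
import numpy as np

def classify_site(sst_daily_c, warm_thresh_c=29.7):
    """Label a site by how often and how far its daily SST exceeds a warm threshold."""
    exceed = sst_daily_c > warm_thresh_c
    frac_warm = exceed.mean()                               # fraction of warm days
    excess = (sst_daily_c[exceed] - warm_thresh_c).sum() if exceed.any() else 0.0
    if frac_warm < 0.05:
        return "lowTP"
    if frac_warm < 0.15 and excess < 100:
        return "modTP"
    return "extTP"

# ten years of synthetic daily SSTs for one site:
sst = 28.5 + 1.5 * np.sin(np.linspace(0, 20 * np.pi, 3650)) \
      + np.random.normal(0, 0.3, 3650)
print(classify_site(sst))
```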
Abstract:
Purpose: The purpose of this paper is to ascertain how today's international marketers can perform better on the global scene by harnessing spontaneity. Design/methodology/approach: The authors draw on contingency theory to develop a model of the relationship between spontaneity and international marketing performance, and identify three potential moderators, namely strategic planning, centralization, and market dynamism. The authors test the model via structural equation modeling with survey data from 197 UK exporters. Findings: The results indicate that spontaneity is beneficial to exporters in terms of enhancing profit performance. In addition, greater centralization and strategic planning strengthen the positive effects of spontaneity. However, market dynamism mitigates the positive effect of spontaneity on export performance (when customer needs are volatile, spontaneous decisions do not function as well in ensuring success). Practical implications: Learning to be spontaneous when making export decisions appears to result in favorable outcomes for the export function. To harness spontaneity, export managers should look to develop company heuristics (increase centralization and strategic planning). Finally, when operating in dynamic export market environments the role of spontaneity is weaker, so more conventional decision-making approaches should be adopted. Originality/value: The international marketing environment typically requires decisions to be flexible and fast. In this context, spontaneity could enable accelerated and responsive decision-making, allowing international marketers to realize superior performance. Yet there is a lack of research on decision-making spontaneity and its potential for enhancing international marketing performance.
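The study tests moderation via structural equation modeling; as a simpler illustration of the same logic, the sketch below fits an ordinary regression with interaction (moderator) terms on synthetic data. All variable names and coefficients are hypothetical; only the sample size of 197 comes from the paper.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 197                                     # matches the survey sample size
spont = rng.normal(size=n)                  # decision-making spontaneity
plan = rng.normal(size=n)                   # strategic planning
central = rng.normal(size=n)                # centralization
dynam = rng.normal(size=n)                  # market dynamism
# Synthetic performance with the signs the findings describe:
perf = (0.3 * spont + 0.2 * spont * plan + 0.2 * spont * central
        - 0.25 * spont * dynam + rng.normal(size=n))

X = np.column_stack([spont, plan, central, dynam,
                     spont * plan, spont * central, spont * dynam])
model = sm.OLS(perf, sm.add_constant(X)).fit()
print(model.params)   # a positive interaction term = that moderator strengthens the effect
```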
Abstract:
The cold-water coral Lophelia pertusa is one of the few species able to build reef-like structures and a 3-dimensional coral framework in the deep oceans. Furthermore, deep cold-water coral bioherms may be among the first marine ecosystems to be affected by ocean acidification. Colonies of L. pertusa were collected during a 2006 cruise to cold-water coral bioherms of the Mingulay reef complex (Hebrides, North Atlantic). Shortly after collection, these corals were labelled onboard with calcium-45. The same experimental approach was used to assess calcification rates, and how those changed under reduced pH, during a cruise to the Skagerrak (North Sea) in 2007. The highest calcification rates were found in the youngest polyps, with up to 1% d-1 new skeletal growth and average rates of 0.11±0.02% d-1 (±S.E.). Lowering pH by 0.15 and 0.3 units relative to the ambient level reduced calcification by 30% and 56%, respectively. Lower pH reduced calcification more in fast-growing young polyps (59% reduction) than in older polyps (40% reduction); thus the skeletal growth of young, fast-calcifying corallites suffered more from ocean acidification. Nevertheless, L. pertusa exhibited positive net calcification (as measured by 45Ca incorporation) even at an aragonite saturation state below 1.
Abstract:
A more than two-decade sediment trap record from the Eastern Boundary Upwelling Ecosystem (EBUE) off Cape Blanc, Mauritania, is analysed with respect to deep-ocean mass fluxes, flux components, and their variability on seasonal to decadal timescales. The total mass flux revealed interannual fluctuations superimposed on fluctuations at decadal timescales. High winter fluxes of biogenic silica (BSi), used as a measure of marine production (mostly by diatoms), largely correspond to a positive North Atlantic Oscillation (NAO) index (December-March); however, this relationship is weak. The highest positive BSi anomaly was in winter 2004-2005, when the NAO was in a neutral state. More episodic BSi sedimentation events occurred in several summer seasons between 2001 and 2005, when the previous winter NAO was neutral or even negative. We suggest that distinct dust outbreaks and deposition in the surface ocean in winter, and occasionally in summer/autumn, enhanced particle sedimentation and carbon export on short timescales via the ballasting effect. Episodic perturbations of the marine carbon cycle by dust outbreaks (e.g. in 2005) might have weakened the relationships between fluxes and large-scale climatic oscillations. As phytoplankton biomass is high throughout the year, any dry (in winter) or wet (in summer) deposition of fine-grained dust particles is assumed to enhance the efficiency of the biological pump by incorporating dust into dense and fast-settling organic-rich aggregates. A good correspondence between BSi and dust fluxes was observed for the dusty year 2005, following a period of rather dry conditions in the Sahara/Sahel region. Large changes in all bulk fluxes occurred during the strongest El Niño-Southern Oscillation (ENSO) event in 1997-1999, with low fluxes obtained for almost a year during the warm El Niño phase and high fluxes in the following cold La Niña phase. On decadal timescales, Bakun (1990) suggested an intensification of coastal upwelling due to increased winds (the ''Bakun upwelling intensification hypothesis''; Cropper et al., 2014) and global climate change. We did not observe an increase in any flux component off Cape Blanc during the past two and a half decades that would support this. Furthermore, fluxes of mineral dust did not show any positive or negative trend over time that might suggest enhanced desertification or ''Saharan greening'' during the last few decades.
Abstract:
Thermal and fatigue cracking are two of the major pavement distress phenomena that contribute significantly to premature pavement failures in Ontario. This in turn puts a massive burden on provincial budgets, as the government spends huge sums on road repair and rehabilitation every year. Governments therefore need to rethink and re-evaluate their current measures in order to prevent such failures in future. The main objectives of this study include: investigation of the fatigue distress of 11 contract samples at 10°C, 15°C, 20°C, and 25°C and the use of crack-tip-opening-displacement (CTOD) requirements at temperatures other than 15°C; comparative investigation of the thermal and fatigue distress of 8 Ministry of Transportation (MTO) recovered and straight asphalt samples through the double-edge-notched tension (DENT) test and extended bending beam rheometry (EBBR); chemical testing of all samples through X-ray fluorescence (XRF) and Fourier transform infrared (FTIR) analysis; Dynamic Shear Rheometer (DSR) high- and intermediate-temperature grading; and a case study of a local Kingston road. All but one of the 11 contract samples showed satisfactory performance at all temperatures. Study of CTOD at various temperatures found a strong correlation between the two variables (CTOD and temperature). All recovered samples showed poor performance in their ability to resist thermal and fatigue distress relative to their corresponding straight asphalts, as evident in the DENT test and EBBR results. XRF and FTIR testing of all samples showed the addition of waste engine oil (WEO) to be the root cause of pavement failures. DSR high-temperature grading showed superior performance of the recovered binders relative to straight asphalt. The local Kingston road showed extensive signs of damage due to thermal and fatigue distress, as evident from the DENT test, EBBR results, and pictures taken in the field. In light of these facts, the use of waste engine oil and recycled asphalt in pavements should be avoided, as these have been shown to cause premature pavement failure.
Abstract:
Field-programmable gate arrays are ideal hosts to custom accelerators for signal, image, and data processing, but demand manual register transfer level design if high performance and low cost are desired. High-level synthesis reduces this design burden but requires manual design of complex on-chip and off-chip memory architectures, a major limitation in applications such as video processing. This paper presents an approach to resolve this shortcoming. A constructive process is described that can derive such accelerators, including on- and off-chip memory storage, from a C description such that a user-defined throughput constraint is met. By employing a novel statement-oriented approach, dataflow intermediate models are derived and used to support simple approaches for on-/off-chip buffer partitioning, derivation of custom on-chip memory hierarchies, and architecture transformation to ensure user-defined throughput constraints are met with minimum cost. When applied to accelerators for full search motion estimation, matrix multiplication, Sobel edge detection, and fast Fourier transform, it is shown how real-time performance up to an order of magnitude in advance of existing commercial HLS tools is enabled whilst including all requisite memory infrastructure. Further, optimizations are presented that reduce the on-chip buffer capacity and physical resource cost by up to 96% and 75%, respectively, whilst maintaining real-time performance.
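To give a flavour of the throughput-driven memory sizing such a flow automates, here is a back-of-envelope sketch for a 3x3 sliding-window operator (e.g. Sobel) over streamed video. The paper's actual derivation operates on dataflow intermediate models; the formulas below are only the standard line-buffer arithmetic, and all numbers are assumptions.

```python
def sliding_window_buffers(width, height, fps, clock_hz, window=3, bits_per_pixel=8):
    pixel_rate = width * height * fps                 # pixels/s the accelerator must absorb
    pixels_per_cycle = pixel_rate / clock_hz          # required parallelism
    lanes = max(1, -(-pixel_rate // clock_hz))        # ceiling division: parallel pixel lanes
    # A window operator must retain (window - 1) full lines plus window pixels on chip.
    on_chip_bits = ((window - 1) * width + window) * bits_per_pixel
    return {"pixels_per_cycle": round(pixels_per_cycle, 3),
            "parallel_lanes": lanes,
            "on_chip_buffer_bits": on_chip_bits}

print(sliding_window_buffers(width=1920, height=1080, fps=60, clock_hz=150_000_000))
# -> ~0.83 pixels/cycle, so one lane suffices; buffer two lines (~3.8 KB) on chip.
```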
Abstract:
The main objective of this work was to develop a novel dimensionality reduction technique as part of an integrated pattern recognition solution capable of identifying adulterants, such as hazelnut oil in extra virgin olive oil, at low percentages based on spectroscopic chemical fingerprints. A novel Continuous Locality Preserving Projections (CLPP) technique is proposed which allows the continuous nature of the in-house produced admixtures to be modelled as data series instead of discrete points. Maintaining the continuous structure of the data manifold enables better visualisation of the examined classification problem and facilitates more accurate utilisation of the manifold for detecting adulterants. The performance of the proposed technique is validated with two different spectroscopic techniques (Raman and Fourier transform infrared, FT-IR). In all cases studied, CLPP accompanied by the k-Nearest Neighbours (kNN) algorithm was found to outperform other state-of-the-art pattern recognition techniques.
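CLPP itself is the paper's contribution and is not reproduced here; as a baseline, the sketch below implements standard Locality Preserving Projections (the method CLPP extends) followed by 1-NN classification, with an assumed heat-kernel width and neighbour count.

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, k=5, t=1.0):
    """Project rows of X (samples x features) onto LPP directions."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):                                    # symmetric kNN heat-kernel graph
        for j in np.argsort(d2[i])[1:k + 1]:
            W[i, j] = W[j, i] = np.exp(-d2[i, j] / t)
    D = np.diag(W.sum(1))
    L = D - W                                             # graph Laplacian
    A, B = X.T @ L @ X, X.T @ D @ X
    # Smallest generalized eigenvectors preserve locality: A a = lambda B a.
    vals, vecs = eigh(A, B + 1e-9 * np.eye(B.shape[0]))
    proj = vecs[:, :n_components]
    return X @ proj, proj

def predict_1nn(Z_train, y_train, Z_test):
    """Assign each projected test spectrum the label of its nearest training spectrum."""
    d = ((Z_test[:, None, :] - Z_train[None, :, :]) ** 2).sum(-1)
    return y_train[d.argmin(1)]
```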