945 results for Kautz filters


Relevance: 10.00%

Publisher:

Abstract:

A digital differentiator computes the derivative of an input signal. This work presents first-degree and second-degree differentiators, designed as both infinite-impulse-response (IIR) filters and finite-impulse-response (FIR) filters. The proposed differentiators have low-pass magnitude response characteristics, thereby rejecting noise at frequencies higher than the cut-off frequency. Both steady-state frequency-domain characteristics and time-domain analyses are given for the proposed differentiators. It is shown that the proposed differentiators perform well when compared to previously proposed filters. When considering the time-domain characteristics of the differentiators, the processing of quantized signals proved especially enlightening in terms of the filtering effects of the proposed differentiators. The coefficients of the proposed differentiators are obtained using an optimization algorithm whose objectives include both the magnitude and phase response. The low-pass characteristic of the proposed differentiators is achieved by minimizing the filter variance. The resulting low-pass differentiators exhibit steep roll-off as well as highly accurate magnitude response in the pass-band. Although fractional calculus has a history of over three hundred years, the design of fractional differentiators has become a ‘hot topic’ in recent decades. One challenging problem in this area is that there are many different definitions of the fractional model, such as the Riemann-Liouville and Caputo definitions. Through the use of a feedback structure based on the Riemann-Liouville definition, it is shown that the performance of the fractional differentiator can be improved in both the frequency domain and the time domain. Two applications based on the proposed differentiators are described in the thesis.
Specifically, the first involves the application of second-degree differentiators to the estimation of the frequency components of a power system. The second concerns an image-processing application: edge detection.
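As a hedged illustration of the low-pass idea described above (ours, not the thesis's optimized design), even the classic central-difference FIR differentiator h = [0.5, 0, -0.5] pairs differentiation with noise rejection: its magnitude response |H(e^jw)| = |sin(w)| tracks the ideal response |H| = w at low frequencies but rolls off toward zero near Nyquist.

```python
import numpy as np

# Central-difference FIR differentiator (illustrative, not the thesis design):
# H(e^jw) = 0.5 - 0.5 e^(-2jw), so |H(e^jw)| = |sin(w)|.
h = np.array([0.5, 0.0, -0.5])

def magnitude_response(h, w):
    """|H(e^jw)| of an FIR filter h at frequencies w (rad/sample)."""
    n = np.arange(len(h))
    return np.abs(np.sum(h * np.exp(-1j * np.outer(w, n)), axis=1))

w = np.linspace(0.01, np.pi, 5)
H = magnitude_response(h, w)
for wi, hi in zip(w, H):
    print(f"w = {wi:.2f}: |H| = {hi:.3f}  (ideal differentiator: {wi:.3f})")
```

At w = 0.01 the response is essentially the ideal value, while at w = pi it is near zero, i.e. high-frequency noise is attenuated rather than amplified as it would be by an ideal differentiator.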

Relevance: 10.00%

Publisher:

Abstract:

Photonic integration has become an important research topic for applications in the telecommunications industry. Current optical internet infrastructure has reached capacity, with current-generation dense wavelength division multiplexing (DWDM) systems fully occupying the low-absorption region of optical fibre from 1530 nm to 1625 nm (the C and L bands). This is due both to an increase in the number of users worldwide and to existing users demanding more bandwidth. Therefore, current research is focussed on using the available telecommunication spectrum more efficiently. To this end, coherent communication systems are being developed. Advanced coherent modulation schemes can be quite complex in terms of the number and array of devices required for implementation. In order to make these systems viable both logistically and commercially, photonic integration is required. In traditional DWDM systems, arrayed waveguide gratings (AWGs) are used to both multiplex and demultiplex the multi-wavelength signal involved. AWGs are widely used as they allow filtering of the many DWDM wavelengths simultaneously. However, when moving to coherent telecommunication systems such as coherent optical frequency division multiplexing (OFDM), smaller free spectral ranges (FSRs) are required from the AWG. This increases the size of the device, which is counter to the miniaturisation that integration is trying to achieve. Much work was done with active filters during the 1980s. This involved using a laser device (usually below threshold) to allow selective wavelength filtering of input signals. By using devices with more complicated cavity geometries, such as distributed feedback (DFB) lasers and sampled-grating distributed Bragg reflector (SG-DBR) lasers, narrowband filtering is achievable with high suppression (>30 dB) of spurious wavelengths. The active nature of the devices also means that, through carrier injection, the refractive index can be altered, resulting in tunability of the filter.
Used above threshold, active filters become useful in filtering coherent combs. Through injection locking, the coherence of the filtered wavelengths with the original comb source is retained. This gives active filters potential application in coherent communication systems as demultiplexers. This work focuses on the use of slotted Fabry-Pérot (SFP) semiconductor lasers as active filters. Experiments were carried out to verify that SFP lasers are useful as tunable active filters. In all experiments in this work the SFP lasers were operated above threshold, so injection locking was the mechanism by which the filters operated. Performance of the lasers under injection locking was examined using both single-wavelength and coherent comb injection. In another experiment, two discrete SFP lasers were used simultaneously to demultiplex a two-line coherent comb. The relative coherence of the comb lines was retained after demultiplexing. After showing that SFP lasers could be used to successfully demultiplex coherent combs, a photonic integrated circuit was designed and fabricated. This involved monolithic integration of an MMI power splitter with an array of single-facet SFP lasers. This device was tested in much the same way as the discrete devices. The integrated device was used to successfully demultiplex a two-line coherent comb signal whilst retaining the relative coherence between the filtered comb lines. A series of models was then employed to understand the resonance characteristics of the fabricated devices and their performance under injection locking. Using this information, alterations to the SFP laser designs were made which were theoretically shown to provide improved performance and suitability for filtering coherent comb signals.
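A back-of-envelope illustration (with assumed values, not figures from the thesis) of why small free spectral ranges force large passive devices: the FSR of a Fabry-Pérot cavity scales inversely with its optical length.

```python
# FSR of a Fabry-Perot cavity: FSR = c / (2 n_g L).
# Group index and cavity length below are typical assumed values.
c = 2.998e8        # speed of light, m/s
n_g = 3.5          # typical semiconductor group index (assumption)
L = 400e-6         # cavity length, 400 um (assumption)

fsr_hz = c / (2.0 * n_g * L)
print(f"FSR = {fsr_hz / 1e9:.1f} GHz")
```

Halving the required FSR doubles the cavity (or AWG path-length) scale, which is the size-versus-integration conflict noted above.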

Relevance: 10.00%

Publisher:

Abstract:

This dissertation addressed the issue of sustainable development at the level of individual behaviors. Environmental perceptions were obtained from people living around the Chamela-Cuixmala biosphere reserve in Jalisco, Mexico. The people identified several environmental issues, such as garbage and grey water in the streets, the burning of plastics, and the lack of use of recreational areas. All of these issues could be addressed by a change in the villagers' behavior. Familiarization activities, including talks to school children and workshops, were conducted to gain people's trust in order to hold a community forum. Four methodologies combining memetics and participation were generated to test which would ameliorate the environmental issues identified, through a change in behavior. The methodologies were 1) Memes; 2) Participation and Memes; 3) Participation; 4) Neither Participation nor Memes. A meme is an idea expressed within a linguistic structure or architecture that provides it with self-disseminating and self-protecting characteristics within and among the minds of individuals, congruent with their values, beliefs and filters. Four villages were chosen as treatments and one as the control, for a total of five experimental villages. A different behavior was addressed in each treatment village (garbage, grey water, burning plastics, recreation). A nonequivalent control-group design was established: a pretest was conducted in all five villages; the methodologies were applied in the four treatment villages; and a posttest was conducted in all five villages. The pretest and posttest consisted of measuring sensory-specific indicators, which are manifestations of behavior that can be seen, smelled, touched, heard or tasted. Statistically significant differences in behavior from the control were found for two of the methodologies: 1) Memes (p=0.0403) and 2) Participation and Memes (p=0.0064).
For the methodologies of 3) Participation alone and 4) Neither, the differences were not significant (p=0.8827, p=0.5627 respectively). When using memes, people's behavior improved when compared to the control. Participation alone did not generate a significant difference. Participation aided in the generation of the memes. Memetics is a tool that can be used to establish a linkage between human behavior and ecological health.

Relevance: 10.00%

Publisher:

Abstract:

Thermal-optical analysis is a conventional method for classifying carbonaceous aerosols as organic carbon (OC) and elemental carbon (EC). This article examines the effects of three different temperature protocols on the measured EC. For analyses of parallel punches from the same ambient sample, the protocol with the highest peak helium-mode temperature (870°C) gives the smallest amount of EC, while the protocol with the lowest peak helium-mode temperature (550°C) gives the largest amount of EC. These differences are observed whether sample transmission or reflectance is used to define the OC/EC split. An important issue is the effect of the peak helium-mode temperature on the relative rate at which different types of carbon with different optical properties evolve from the filter. Analyses of solvent-extracted samples demonstrate that high temperatures (870°C) lead to premature EC evolution in the helium mode. For samples collected in Pittsburgh, this causes the measured EC to be biased low, because the attenuation coefficient of pyrolyzed carbon is consistently higher than that of EC. While this problem can be avoided by lowering the peak helium-mode temperature, analyses of wood-smoke-dominated ambient samples and levoglucosan-spiked filters indicate that an excessively low peak helium-mode temperature (550°C) allows non-light-absorbing carbon to slip into the oxidizing mode of the analysis. If this carbon evolves after the OC/EC split, it biases the EC measurements high. Given the complexity of ambient aerosols, there is unlikely to be a single peak helium-mode temperature at which both of these biases can be avoided. Copyright © American Association for Aerosol Research.

Relevance: 10.00%

Publisher:

Abstract:

Much research on acculturation in global experiences has focused on the “type” of overseas experience, e.g. expatriate, repatriate, inpatriate, flexpatriate. The experiences of people in those categories and across various demographics (single/married/divorced; gender; sexual orientation) can differ dramatically. In addition, given the explosion of people working in global business, some global business citizens could well fit several of those types of experience over the course of their careers. In this paper, we propose to push in a somewhat different direction and explore something that for us would be quite new. Rather than focusing on the various categories and their resulting experiences, we take a step back to consider what attributes and ways of thinking a global citizen may need to become better as a global business citizen, regardless of type of experience. The question is less one of “Who am I?” than of “How can I become better?”. Essentially, we would like to explore what might be required in moving the global citizen from thinking about “global mindset” to “global mindsponge.” When we hear the term mindset, we think of a certain way of thinking that stays rather fixed. So part of the challenge of the paper will be to define and examine what mindsponge might mean in the global context: what does it take to unlearn or squeeze out certain ways of thinking or behaving before absorbing and reshaping new ways of thinking and behaving? Moreover, how might that become part of a natural and regular way of operating, especially in a rapidly changing developing country such as Vietnam? At this early stage, we think of mindsponge as a mechanism that encourages flexibility and receptiveness through a process of using multiple filters, with a greater focus on creativity, that is, doing things differently to improve organizational or individual performance.
Our goal is to develop a basic conceptual framework for “mindsponge,” drawing upon a broad literature review as well as several unstructured interviews, to assess whether the idea of mindsponge helps people perceive that they are more culturally versatile and culturally mobile, regardless of whether they work in or outside of their “home environment.” We hope this would enhance their ability to shape an emerging set of cultural values that erases the divide between “foreign” and “local” cultural differences that so often dominates in emerging economies.

Relevance: 10.00%

Publisher:

Abstract:

We present a novel system to be used in the rehabilitation of patients with forearm injuries. The system uses surface electromyography (sEMG) recordings from a wireless sleeve to control video games designed to provide engaging biofeedback to the user. An integrated hardware/software system uses a neural net to classify the signals from a user’s muscles as they perform one of a number of common forearm physical therapy exercises. These classifications are used as input for a suite of video games custom-designed to hold the patient’s attention and decrease the risk of noncompliance with the physical therapy regimen necessary to regain full function in the injured limb. The data are transmitted wirelessly from the on-sleeve board to a laptop computer using a custom-designed signal-processing algorithm that filters and compresses the data prior to transmission. We believe that this system has the potential to significantly improve both the patient experience and the efficacy of physical therapy, using biofeedback that leverages the compelling nature of video games.
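The classification idea above can be sketched in a heavily simplified, hypothetical form: synthetic data in place of real sEMG, a windowed-RMS feature, and a nearest-centroid rule standing in for the neural net. The sample rate, window size, and class templates are all our assumptions, not the authors' design.

```python
import numpy as np

# Toy sEMG pipeline sketch: synthetic burst -> windowed RMS -> nearest centroid.
rng = np.random.default_rng(0)
fs = 1000                                       # assumed sample rate, Hz
emg = rng.normal(0.0, 1.0, fs) * np.hanning(fs)  # synthetic 1 s muscle burst

def rms_feature(x, win=100):
    """Root-mean-square energy over non-overlapping windows."""
    n = len(x) // win
    return np.sqrt(np.mean(x[:n * win].reshape(n, win) ** 2, axis=1))

feat = rms_feature(emg)

# Nearest-centroid stand-in for the classifier (class templates assumed).
centroids = {"rest": 0.05, "wrist_flexion": 0.4, "wrist_extension": 0.8}
mean_rms = float(np.mean(feat))
label = min(centroids, key=lambda k: abs(centroids[k] - mean_rms))
print(label, round(mean_rms, 3))
```

A real system would replace the centroid rule with the trained network and stream the RMS frames over the wireless link.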

Relevance: 10.00%

Publisher:

Abstract:

User-supplied knowledge and interaction is a vital component of a toolkit for producing high-quality parallel implementations of scalar FORTRAN numerical code. In this paper we consider the components that such a parallelisation toolkit should possess to provide an effective environment in which to identify, extract and embed relevant user knowledge. We also examine to what extent these facilities are available in leading parallelisation tools; in particular, we discuss how these issues have been addressed in the development of the user interface of the Computer Aided Parallelisation Tools (CAPTools). The CAPTools environment has been designed to enable user exploration, interaction and insertion of user knowledge to facilitate the automatic generation of very efficient parallel code. A key issue in the user's interaction is control of the volume of information, so that the user is focused on only that which is needed. User control over the level and extent of information revealed at any phase is supplied through a wide variety of filters. Another issue is the way in which information is communicated. Dependence analysis and its resulting graphs involve many sophisticated, rather abstract concepts unlikely to be familiar to most users of parallelising tools. As such, considerable effort has been made to communicate with the user in terms that they will understand. These features, amongst others, and their use in the parallelisation process are described and their effectiveness discussed.

Relevance: 10.00%

Publisher:

Abstract:

The fourth-order partial differential equation (PDE) proposed by You and Kaveh (the You-Kaveh fourth-order PDE), which replaces the gradient operator in classical second-order nonlinear diffusion methods with a Laplacian operator, is able to avoid the blocky effects often caused by second-order nonlinear PDEs. However, the equation brought forward by You and Kaveh tends to leave the processed images with isolated black and white speckles. Although You and Kaveh use median filters to remove these speckles, median filters can blur the processed images to some extent, which weakens the result of the You-Kaveh fourth-order PDE. In this paper, the reason why the You-Kaveh fourth-order PDE can leave the processed images with isolated black and white speckles is analyzed, and a new fourth-order PDE based on changes of the Laplacian (the LC fourth-order PDE) is proposed and tested. The new fourth-order PDE preserves the advantage of the You-Kaveh fourth-order PDE and avoids leaving isolated black and white speckles. Moreover, the new fourth-order PDE keeps boundaries from being blurred and preserves nuances in the processed images, so the processed images look natural.
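A minimal sketch (our illustration, not the paper's exact scheme) of a You-Kaveh-style fourth-order diffusion, u_t = -lap(c(|lap u|) * lap u) with diffusivity c(s) = 1/(1 + (s/k)^2). The edge threshold k, step size dt, iteration count, and test image are all assumptions.

```python
import numpy as np

def laplacian(u):
    # 5-point Laplacian with periodic boundaries
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)

def you_kaveh_step(u, k=10.0, dt=0.02):
    """One explicit step of u_t = -lap(c(|lap u|) * lap u)."""
    lap = laplacian(u)
    c = 1.0 / (1.0 + (np.abs(lap) / k) ** 2)   # small c where |lap| is large
    return u - dt * laplacian(c * lap)

rng = np.random.default_rng(1)
img = np.zeros((32, 32)); img[8:24, 8:24] = 100.0   # clean step image
noisy = img + rng.normal(0.0, 5.0, img.shape)

den = noisy.copy()
for _ in range(10):
    den = you_kaveh_step(den)

mse_before = float(np.mean((noisy - img) ** 2))
mse_after = float(np.mean((den - img) ** 2))
print(f"MSE before: {mse_before:.1f}, after: {mse_after:.1f}")
```

Because c collapses where |lap u| is large, edges diffuse slowly while flat-region noise is smoothed, which is the behavior the paper builds on.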

Relevance: 10.00%

Publisher:

Abstract:

A simulated in situ incubation box has been compared with in situ exposure for 14C production measurements in an estuarine environment. Measurements were made over the course of 14 months, mainly in the Tamar estuary; production rates ranged from less than 1 mg C m−2 h−1 to 350 mg C m−2 h−1, and there was no significant difference between results from the two methods. In the estuarine waters investigated, the simulated in situ incubator with neutral density filters, used with a Secchi disc to determine sampling depths, gives a satisfactory estimate of in situ primary production.

Relevance: 10.00%

Publisher:

Abstract:

Current knowledge about the spread of pathogens in aquatic environments is scarce, probably because bacteria, viruses, algae and their toxins tend to occur at low concentrations in water, making them very difficult to measure directly. The purpose of this study was the development and validation of tools to detect pathogens in freshwater systems close to an urban area. In order to evaluate anthropogenic impacts on water microbiological quality, a phylogenetic microarray was developed within the EU project µAQUA to detect numerous pathogens simultaneously, and was applied to samples from two locations on the Tiber River, upstream and downstream of Rome. Furthermore, human enteric viruses were also detected. Fifty liters of water were collected and concentrated using a hollow-fiber ultrafiltration approach. The resultant concentrate was further size-fractionated through a series of filters of decreasing pore size. RNA was extracted from the pooled filters and hybridized to the newly designed microarray to detect pathogenic bacteria, protozoa and toxic cyanobacteria. Diatoms, as indicators of water quality status, were also included in the microarray. The microarray gave positive signals for bacteria, diatoms, cyanobacteria and protozoa. Cross-validation of the microarray was performed using standard microbiological methods for the bacteria. The presence of human enteric viruses transmitted by the oral-fecal route was detected using qPCR. Significant concentrations of Salmonella, Clostridium, Campylobacter and Staphylococcus, as well as hepatitis E virus (HEV), noroviruses GI (NoGI) and GII (NoGII) and human adenovirus 41 (ADV 41), were found at the Mezzocammino site, whereas lower concentrations of other bacteria and only ADV 41 were recovered at the Castel Giubileo site.
This study revealed that the pollution level in the Tiber River was considerably higher downstream of Rome than upstream, and that the downstream location was contaminated by emerging and re-emerging pathogens.

Relevance: 10.00%

Publisher:

Abstract:

The potential for physical removal of Mycobacterium avium ssp. paratuberculosis (M. paratuberculosis) from milk by centrifugation and microfiltration was investigated by simulating commercial processing conditions in the laboratory by means of a microcentrifuge and syringe filters, respectively. Results indicated that both centrifugation of preheated milk (60 °C) at 7000 × g for 10 s, and microfiltration through a filter of pore size 1.2 µm, were capable of removing up to 95-99.9% of M. paratuberculosis cells from spiked whole milk and Middlebrook 7H9 broth suspensions, respectively. Centrifugation and microfiltration may therefore have potential application within the dairy industry as pretreatments to reduce M. paratuberculosis contamination of raw milk.

Relevance: 10.00%

Publisher:

Abstract:

A method for simulation of acoustical bores, useful in the context of sound synthesis by physical modeling of woodwind instruments, is presented. As with previously developed methods, such as digital waveguide modeling (DWM) [Smith, Comput. Music J. 16, pp. 74-91 (1992)] and the multiconvolution algorithm (MCA) [Martinez et al., J. Acoust. Soc. Am. 84, pp. 1620-1627 (1988)], the approach is based on a one-dimensional model of wave propagation in the bore. Both the DWM method and the MCA explicitly compute the transmission and reflection of wave variables that represent actual traveling pressure waves. The method presented in this report, the wave digital modeling (WDM) method, avoids the typical limitations associated with these methods by using a more general definition of the wave variables. An efficient and spatially modular discrete-time model is constructed from the digital representations of elemental bore units such as cylindrical sections, conical sections, and toneholes. Frequency-dependent phenomena, such as boundary losses, are approximated with digital filters. The stability of a simulation of a complete acoustic bore is investigated empirically. Results of the simulation of a full clarinet show that very good concordance with classical transmission-line theory is obtained.
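A hedged sketch of the simplest DWM-style baseline the abstract contrasts with WDM: a cylindrical bore as two delay lines of traveling pressure waves, a +1 reflection at the closed (reed) end, and an inverting one-pole low-pass at the open end standing in for frequency-dependent losses. The bore length and filter coefficient are illustrative assumptions.

```python
import numpy as np

fs = 44100                      # sample rate, Hz
c_air, L = 343.0, 0.30          # sound speed (m/s) and bore length (m), assumed
N = int(round(L / c_air * fs))  # one-way propagation delay in samples
right = np.zeros(N)             # right-going wave (toward open end)
left = np.zeros(N)              # left-going wave (toward closed end)
a = 0.6                         # one-pole loss-filter coefficient (assumption)
lp = 0.0
out = []
right[0] = 1.0                  # excite the bore with an impulse
for _ in range(4096):
    at_open = right[-1]
    lp = a * lp - (1.0 - a) * at_open   # lossy, inverting open-end reflection
    at_closed = left[-1]
    right = np.roll(right, 1); left = np.roll(left, 1)
    right[0] = at_closed        # closed end reflects with +1
    left[0] = lp
    out.append(at_open)

spec = np.abs(np.fft.rfft(out))
k = 1 + int(np.argmax(spec[1:]))        # skip the DC bin
f0 = k * fs / len(out)
print(f"fundamental ~ {f0:.0f} Hz (quarter-wave estimate {c_air / (4 * L):.0f} Hz)")
```

The resonance sits near the quarter-wave prediction c/4L for a closed-open tube, slightly lowered by the phase delay of the loss filter; the WDM method of the paper generalizes the wave variables rather than this explicit traveling-wave form.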

Relevance: 10.00%

Publisher:

Abstract:

We report on a study comparing the absolute K-alpha yield from Ti foils, measured with a calibrated system of an X-ray CCD coupled to a curved LiF von Hamos crystal spectrometer, to the difference in the signals measured simultaneously with two similar photodiodes fitted with two different filters. Our data indicate that a combination of photodiodes with different filters could be developed into an alternative and inexpensive diagnostic for monitoring single-shot pulsed emission in a narrow band of the X-ray region.

Relevance: 10.00%

Publisher:

Abstract:

We observed a stellar occultation by Titan on 2003 November 14 from La Palma Observatory using ULTRACAM with three Sloan filters: u, g, and i (358, 487, and 758 nm, respectively). The occultation probed latitudes 2° S and 1° N during immersion and emersion, respectively. A prominent central flash was present in only the i filter, indicating wavelength-dependent atmospheric extinction. We inverted the light curves to obtain six lower-limit temperature profiles between 335 and 485 km (0.04 and 0.003 mb) altitude. The i profiles agreed with the temperature measured by the Huygens Atmospheric Structure Instrument [Fulchignoni, M., and 43 colleagues, 2005. Nature 438, 785-791] above 415 km (0.01 mb). The profiles obtained from different wavelength filters systematically diverge as altitude decreases, which implies significant extinction in the light curves. Applying an extinction model [Elliot, J.L., Young, L.A., 1992. Astron. J. 103, 991-1015] gave the altitudes at which the line-of-sight optical depth equals unity: 396±7 and 401±20 km (u immersion and emersion); 354±7 and 387±7 km (g immersion and emersion); and 336±5 and 318±4 km (i immersion and emersion). Further analysis showed that the optical depth follows a power law in wavelength with index 1.3±0.2. We present a new method for determining temperature from scintillation spikes in the occulting body's atmosphere. Temperatures derived with this method are equal to or warmer than those measured by the Huygens Atmospheric Structure Instrument. Using the highly structured, three-peaked central flash, we confirmed the shape of Titan's middle atmosphere using a model originally derived for a previous Titan occultation [Hubbard, W.B., and 45 colleagues, 1993. Astron. Astrophys. 269, 541-563].
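A quick arithmetic check (ours, not part of the paper's analysis) of what the quoted power-law index implies: with tau proportional to lambda^(-1.3), the haze is roughly 2.7 times more opaque in u than in i, consistent with the tau = 1 altitude being highest in u and lowest in i.

```python
# Relative optical depth of Titan's haze across the three Sloan filters,
# assuming tau(lambda) ~ lambda^(-1.3) (the index reported above).
wavelengths_nm = {"u": 358.0, "g": 487.0, "i": 758.0}
index = 1.3

tau_rel = {band: (lam / wavelengths_nm["i"]) ** (-index)
           for band, lam in wavelengths_nm.items()}
for band, t in tau_rel.items():
    print(f"tau_{band} / tau_i = {t:.2f}")
```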

Relevance: 10.00%

Publisher:

Abstract:

Aims. The aim of this work is to constrain the size, composition and surface properties of asteroids (2867) Steins and (21) Lutetia, targets of the Rosetta mission. Rosetta is en route to rendezvous with comet 67P/Churyumov-Gerasimenko.

Methods. Thermal-infrared N-band observations of Lutetia and Steins were obtained using, respectively, TIMMI2 on the ESO 3.6-m telescope at La Silla and VISIR on the UT3 VLT telescope on Cerro Paranal; visible light curves for Steins were obtained using NTT+SUSI2, while R-band photometry for Lutetia was obtained with the 2.0-m Faulkes Telescope North on Haleakala. For Steins, the NEATM was used to constrain its visible geometric albedo and beaming parameter. A detailed thermophysical model was implemented and used to analyze our set of observations of Lutetia as well as previously reported measurements.

Results. The visible photometry of Steins was used along with data from the literature to yield a slope parameter of G = 0.32(+0.14/−0.11). Problems during the observations led to the loss of measurements in two of the three N-band filters requested for Steins. Using the remaining data and the recently published polarimetric albedo, we were able to constrain the thermal beaming parameter as η > 1.2, which is more similar to near-Earth asteroids and suggests either high thermal inertia or a very rough surface. For Lutetia, the best-fit visible geometric albedo obtained with our model and the reported observation is p_v = 0.129, significantly lower than that obtained if one applies the same model to previously reported measurements. The discrepancy cannot be explained solely by assuming inhomogeneities in the surface properties, and we suggest that the most plausible explanation is the presence of one or more large craters on the northern hemisphere. For both sets of measurements, the implied single-scattering albedo of Lutetia is compatible with laboratory measurements of carbonaceous chondrite meteorites.
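A hedged illustration of how the beaming parameter η enters a NEATM-style fit: a higher η lowers the model's subsolar temperature, mimicking high thermal inertia or surface roughness. All input values below (Bond albedo, emissivity, heliocentric distance) are assumptions for illustration, not the paper's fitted quantities.

```python
# NEATM subsolar temperature: T_ss = [(1 - A) S / (eta eps sigma r^2)]^(1/4)
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0           # solar constant at 1 au, W m^-2

def subsolar_temp(bond_albedo, eta, emissivity, r_au):
    """Subsolar temperature of the NEATM for given beaming parameter eta."""
    return ((1.0 - bond_albedo) * S0 /
            (r_au ** 2 * eta * emissivity * SIGMA)) ** 0.25

# Assumed Lutetia-like inputs: Bond albedo 0.05, emissivity 0.9, r = 2.4 au.
for eta in (1.0, 1.2):
    print(f"eta = {eta}: T_ss = {subsolar_temp(0.05, eta, 0.9, 2.4):.0f} K")
```

The few-kelvin drop between η = 1.0 and η = 1.2 shifts the model's thermal flux enough to matter when fitting N-band photometry, which is why constraining η > 1.2 is a meaningful result.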