947 results for Small-signal transfer functions


Relevance: 30.00%

Publisher:

Abstract:

This paper describes a method of signal preprocessing under active monitoring. Suppose we want to solve the inverse problem of finding the response of a medium to one powerful signal, which is equivalent to obtaining the transmission function of the medium, but cannot conduct such an experiment (it might be too expensive or harmful to the environment). In practice, we can instead conduct a series of experiments at relatively low power and superpose the response signals. However, this approach entails considerable loss of information (especially in the high-frequency domain) because of fluctuations in the phase, the frequency and the starting time of each individual experiment. The preprocessing technique presented in this paper substantially restores the response of the medium and consequently yields a better estimate of the transmission function. The technique is based on expanding the initial signal in a system of orthogonal functions.
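A minimal numerical sketch of the effect described above (the signal shape, jitter and noise levels are invented for illustration, not taken from the paper): naive superposition of many low-power trials averages out additive noise, but the uncorrected start-time jitter smears the response, which is exactly the information loss the proposed preprocessing is meant to repair.

```python
import math
import random

random.seed(0)

# Hypothetical "true" response of the medium to the powerful probe signal.
def medium_response(t):
    return math.exp(-t) * math.sin(8 * t)

# One low-power experiment: start-time/phase jitter plus additive noise.
def noisy_trial(t, jitter=0.08, noise=0.3):
    return medium_response(t + random.gauss(0.0, jitter)) + random.gauss(0.0, noise)

ts = [i / 100 for i in range(200)]
true = [medium_response(t) for t in ts]

# Naive superposition of 400 low-power trials at each time point.
stacked = [sum(noisy_trial(t) for _ in range(400)) / 400 for t in ts]

rmse_stacked = math.sqrt(sum((s - r) ** 2 for s, r in zip(stacked, true)) / len(ts))
rmse_single = math.sqrt(sum((noisy_trial(t) - r) ** 2 for t, r in zip(ts, true)) / len(ts))

print(rmse_stacked < rmse_single)  # True: additive noise averages out
print(max(stacked) < max(true))    # True: jitter smears the peaks (high-frequency loss)
```

The second comparison is the point of the abstract: averaging alone attenuates the sharp features of the response, so a better estimate requires realigning the trials, e.g. via the proposed orthogonal-function expansion.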


Similar to classic Signal Detection Theory (SDT), the recent optimal Binary Signal Detection Theory (BSDT) and the Neural Network Assembly Memory Model (NNAMM) based on it can successfully reproduce Receiver Operating Characteristic (ROC) curves, although the BSDT/NNAMM parameters (cue intensity and neuron threshold) and the classic SDT parameters (perception distance and response bias) are essentially different. In the present work, BSDT/NNAMM optimal likelihood and posterior probabilities are analyzed analytically and used to generate ROCs, modified (posterior) mROCs, and optimal overall likelihoods and posteriors. It is shown that, for the description of basic discrimination experiments in psychophysics within the BSDT, a ‘neural space’ can be introduced in which sensory stimuli are represented as neural codes and decision processes are defined; that the BSDT’s isobias curves can simultaneously be interpreted as universal psychometric functions satisfying the Neyman-Pearson objective; that the just noticeable difference (jnd) can be defined and interpreted as an atom of experience; and that near-neutral values of bias are observers’ natural choice. The uniformity (no-priming) hypothesis, concerning the ‘in-mind’ distribution of false-alarm probabilities during ROC or overall probability estimations, is introduced. The BSDT’s and classic SDT’s sensitivity, bias, and ROC and decision spaces are compared.
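For readers unfamiliar with ROC generation, here is a sketch of the classic equal-variance SDT construction that the abstract uses as its baseline (this is standard SDT, not the BSDT itself; the d' value and criterion grid are arbitrary):

```python
from statistics import NormalDist

# Classic equal-variance SDT: signal ~ N(d', 1), noise ~ N(0, 1); sweeping the
# decision criterion traces out the ROC curve.
def roc_point(d_prime, criterion):
    false_alarm = 1 - NormalDist(0, 1).cdf(criterion)
    hit = 1 - NormalDist(d_prime, 1).cdf(criterion)
    return false_alarm, hit

points = [roc_point(1.5, c / 10) for c in range(-30, 31)]
print(all(hit > fa for fa, hit in points))  # True: the ROC lies above the chance diagonal
```

Varying d' instead of the criterion generates the isosensitivity family; the BSDT's analogous curves are parameterized by cue intensity and neuron threshold rather than by d' and bias.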


There is a proliferation of categorization schemes in the scientific literature, mostly developed from psychologists’ understanding of the nature of linguistic interactions. This has led to problems in defining the question types used by interviewers. Based on the principle that the overarching purpose of an interview is to elicit information and that questions can function both as actions in their own right and as vehicles for other actions, a Conversation Analysis approach was used to analyse a small number of police interviews. The analysis produced a different categorization of question types; in particular, the conversational turns fell into two functional types: (i) Topic Initiation Questions and (ii) Topic Facilitation Questions. We argue that forensic interviewing requires a switch of focus from the ‘words’ used by interviewers in question types to the ‘function’ of conversational turns within interviews.


Record-small and low-loss slow-light optical signal processing devices are proposed and demonstrated using the recently invented Surface Nanoscale Axial Photonics (SNAP) technology.


For wireless power transfer (WPT) systems, communication between the primary side and the pickup side is a challenge because of the large air gap and magnetic interference. A novel method that integrates bidirectional data communication into a high-power WPT system is proposed in this paper. Power and data transfer share the same inductive link between coreless coils. A power/data frequency-division multiplexing technique is applied: the power and the data are transmitted on different frequency carriers and controlled independently. A circuit model of the multiband system is provided to analyze the transmission gain of the communication channel as well as the power delivery performance. The crosstalk interference between the two carriers is discussed. In addition, the signal-to-noise ratios of the channels are estimated, which provides a guideline for the design of mod/demod circuits. Finally, a 500-W WPT prototype has been built to demonstrate the effectiveness of the proposed WPT system.
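The frequency-division multiplexing idea can be sketched numerically (the carrier frequencies, amplitudes and BPSK data scheme below are assumptions for illustration, not the paper's design): a strong power carrier and a weak data carrier share one link, and coherent correlation at the data frequency rejects the power carrier.

```python
import math

fs = 2_000_000                      # sample rate, Hz (assumed)
f_power, f_data = 20_000, 200_000   # power and data carrier frequencies, Hz (assumed)
n = 2000                            # samples per data bit

def link_signal(bit):
    # Strong power carrier plus a weak BPSK-modulated data carrier on one link.
    sign = 1 if bit else -1
    return [10.0 * math.sin(2 * math.pi * f_power * k / fs)
            + 0.1 * sign * math.sin(2 * math.pi * f_data * k / fs)
            for k in range(n)]

def demodulate(samples):
    # Coherent correlation with the data carrier; the power carrier is
    # orthogonal over a whole number of cycles and averages out.
    ref = [math.sin(2 * math.pi * f_data * k / fs) for k in range(n)]
    return sum(s * r for s, r in zip(samples, ref)) > 0

print([demodulate(link_signal(b)) for b in (1, 0, 1)])  # [True, False, True]
```

In the real system the separation is done by band-pass filtering and the carriers also interact through the resonant link, which is why the paper analyzes crosstalk and channel SNR explicitly.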


Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, a general coordinate vector is adopted to represent the position of the components, the error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface of the small components. To reduce the uncertainty of the plane measurement, an evaluation index of the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
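The linearized error propagation underlying such a model can be sketched as follows (the Jacobian and variances are invented for illustration; the paper's actual transmission functions are not reproduced here): with y = Jx and independent, normally distributed errors on x, the output covariance is Cov(y) = J Cov(x) Jᵀ.

```python
import math

# Linearized error propagation for y = J x with independent normal errors on x.
def propagate(J, var_x):
    # var_x holds the diagonal of Cov(x); returns the full Cov(y) = J Cov(x) J^T.
    m = len(J)
    return [[sum(J[i][k] * var_x[k] * J[j][k] for k in range(len(var_x)))
             for j in range(m)] for i in range(m)]

# Two planar-surface coordinates fed by two measurement errors and one fixture
# error (hypothetical sensitivities; variances in mm^2).
J = [[1.0, 0.5, 0.2],
     [0.0, 1.0, 0.8]]
var_x = [0.04, 0.04, 0.01]

C = propagate(J, var_x)
print([round(math.sqrt(C[i][i]), 3) for i in range(2)])  # [0.224, 0.215] (mm, 1-sigma)
```

With numbers like these the measurement variances dominate the fixture variance, mirroring the paper's finding that plane-measurement error controls the final accuracy.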


We numerically investigate the effect of ultralong Raman fiber laser amplifier design parameters, such as span length, pumping distribution and grating reflectivity, on the RIN transfer from the pump to the transmitted signal. Comparison is provided with the performance of traditional second-order Raman amplified schemes, showing a relative performance penalty for ultralong laser systems that shrinks as span length increases. We show that careful choice of system parameters can partially offset this penalty. © 2010 Optical Society of America.


Previous results in our laboratory suggest that (CG)4 segments, whether present in a right-handed or a left-handed conformation, form distinctive junctions with adjacent random sequences. These junctions and their associated sequences have unique structural and thermodynamic properties that may be recognized by DNA-binding molecules. This study probes these sequences using the following small ligands: actinomycin D, 1,4-bis(((di(aminoethyl)amino)ethyl)amino)anthracene-9,10-dione (‘spider’), ametantrone, and tris(phenanthroline)ruthenium(II). These ligands may recognize the distinctive features associated with the (CG)4 segment and its junctions and thus interact preferentially near these sequences. Restriction enzyme inhibition assays were used to determine whether binding interactions took place and to approximate the locations of these interactions. These binding studies were first carried out using two small synthetic oligomers, BZ-III and BZ-IV. The (5meCG)4 segment present in BZ-III adopts the Z-conformation in the presence of 50 mM Co(NH3)6(3+). In BZ-IV, the unmethylated (CG)4 segment changes to a non-B conformation in the presence of 50 mM Co(NH3)6(3+). BZ-IV, containing the (CG)4 segment, was inserted into a plasmid and then digested with the restriction enzyme Hinf I to produce a larger fragment containing the (CG)4 segment. The results obtained on the small oligomers and on the larger fragment with the restriction enzyme Mbo I indicate that 1,4-bis(((di(aminoethyl)amino)ethyl)amino)anthracene-9,10-dione binds more efficiently at or near the (CG)4 segment. Restriction enzymes EcoRV, Sac I and Not I, with cleavage sites upstream and downstream of the (CG)4 insert, were used to further localize binding interactions in the vicinity of the insert. RNA polymerase activity was studied in a plasmid containing the (CG)4 insert downstream of the promoter sites of SP6 and T7 RNA polymerases.
Activities of these two polymerases were studied in the presence of each of the ligands used throughout the study. Only actinomycin D and ‘spider’, which bind at or near the (CG)4 segment, altered the activities of SP6 and T7 RNA polymerases. Surprisingly, enhancement of polymerase activity was observed in the presence of very low concentrations of actinomycin D. These results suggest that the conformational features of (CG) segments may serve regulatory functions in DNA.


A high resolution study of the quasielastic 2H(e,e'p)n reaction was performed in Hall A at the Thomas Jefferson National Accelerator Facility in Newport News, Virginia. The measurements were performed at a central momentum transfer of |q| ∼ 2400 MeV/c, a central energy transfer of ω ∼ 1500 MeV, and a four-momentum transfer Q² = 3.5 (GeV/c)², covering missing momenta from 0 to 0.5 GeV/c. The majority of the measurements were performed at Φ = 180°, and a small set of measurements was taken at Φ = 0°. The Hall A High Resolution Spectrometers (HRS) were used to detect electrons and protons in coincidence. Absolute 2H(e,e'p)n cross sections were obtained as a function of the recoiling neutron's scattering angle with respect to the momentum transfer. The experimental results were compared to a Plane Wave Impulse Approximation (PWIA) model and to a calculation that includes Final State Interaction (FSI) effects. Experimental 2H(e,e'p)n cross sections were determined with an estimated systematic uncertainty of 7%. The general features of the measured cross sections are reproduced by Glauber-based calculations that take the motion of the bound nucleons into account (GEA). FSI contributions were found to depend strongly on the angle of the recoiling neutron with respect to the momentum transfer and on the missing momentum. We found a systematic deviation of the theoretical prediction of about 30%: at small θnq (θnq < 60°) the theory overpredicts the cross section, while at large θnq (θnq > 80°) it underestimates the cross sections. We observed an enhancement of the cross section due to FSI, relative to PWIA, of about 240% for a missing momentum of 0.4 GeV/c at an angle of 75°; for a missing momentum of 0.5 GeV/c the enhancement due to the same FSI effects was about 270%. This is in agreement with GEA.
Standard Glauber calculations predict this large contribution to occur at an angle of 90°. Our results show that GEA better describes the 2H(e,e'p)n reaction.


QCD predicts Color Transparency (CT): the nuclear medium becomes transparent to a small color-neutral object produced in high momentum transfer reactions, owing to its reduced strong interaction. Despite several studies at BNL, SLAC, FNAL, DESY and Jefferson Lab, a definitive signal for CT remains elusive. In this dissertation, we present the results of a new study at Jefferson Lab motivated by theoretical calculations suggesting that a fully exclusive measurement of coherent rho meson electroproduction off the deuteron is a favorable channel for studying CT. Vector meson production has a large cross section at high energies, and the deuteron is the best understood and simplest nuclear system. Exclusivity allows the production and propagation to be controlled separately by controlling Q², lf (formation length), lc (coherence length) and t. This control is important because the rapid expansion of small objects increases their interaction probability and masks CT. The CT signal is investigated in a ratio of cross sections at high t (where re-scattering is significant) to low t (where single-nucleon reactions dominate). The results are presented over a Q² range of 1 to 3 GeV², based on data taken with a beam energy of 6 GeV.


This study examines how public management practitioners in small and medium-sized Florida cities perceive globalization and its impact on public management practice. Using qualitative analysis, descriptive statistics and factor analysis methods, data obtained from a survey and semi-structured interviews were studied to comprehend how public managers view the management and control of their municipalities in a time of globalization. The study shows that the public managers’ perceptions of globalization and its impact on public management in Florida’s small-medium cities are nuanced. Whereas some public managers feel that globalization has significant impacts on municipalities’ viability, others opine that globalization has no local impact. The study further finds that globalization processes are perceived as altering the public management functions of decision-making, economic development and service delivery in some small-medium cities in Florida as a result of transnational shifts, rapidly changing technologies, and municipalities’ heightened involvement in the global economy. The study concludes that the globalization discourse does not resonate among some public managers in Florida’s small-medium cities in ways implied in extant literature.


Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining hardware circuit size. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools, and hardware acceleration yields significant performance improvement for highly mathematical calculations and repeated functions. The performance of SoC systems can then be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts discussed in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance: the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, central-bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed, and the trade-off among these three factors is compared and balanced; different hardware accelerators are implemented and evaluated based on system requirements. (4) A system verification platform is designed based on the Integrated Circuit (IC) workflow, with hardware optimization techniques applied for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
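The relation between a profiled hotspot and the achievable overall speedup follows Amdahl's law; the sketch below uses assumed numbers (hotspot fraction and accelerator gain), not the measured H.264 CODEC figures quoted above.

```python
# Amdahl's law: accelerating a hotspot that occupies fraction f of runtime by a
# factor g bounds the overall speedup.  f and g are assumed, illustrative values.
def overall_speedup(f, g):
    return 1.0 / ((1.0 - f) + f / g)

# A hotspot consuming 90% of cycles, accelerated 20x in FPGA fabric:
print(round(overall_speedup(0.9, 20.0), 2))   # 6.9
# Even an infinitely fast accelerator cannot beat 1 / (1 - f):
print(round(overall_speedup(0.9, 1e12), 2))   # 10.0
```

This is why the workflow starts from profiling: the payoff of any accelerator is capped by the fraction of execution time the chosen hotspot actually accounts for.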


Communication has become an essential function in our civilization. With the increasing demand for communication channels, it is now necessary to find ways to optimize the use of their bandwidth. One way to achieve this is by transforming the information before it is transmitted; this transformation can be performed by several techniques, one of the newest being wavelets. Wavelet transformation refers to breaking down a signal into components called details and trends by using small waveforms that have a zero average in the time domain. After this transformation, the data can be compressed by discarding the details and transmitting only the trends; at the receiving end, the trends are used to reconstruct the image. In this work, the wavelet used for the transformation of an image is selected from a library of available bases. The accuracy of the reconstruction, after the details are discarded, depends on the wavelets chosen from the wavelet basis library. The system developed in this thesis takes a 2-D image and decomposes it using a wavelet bank; a digital signal processor is used to achieve near real-time performance in this transformation task. A contribution of this thesis project is the development of a DSP-based test bed for the future development of new real-time wavelet transformation algorithms.
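The decompose/discard/reconstruct pipeline can be illustrated with a one-level Haar transform, used here as a minimal stand-in for the thesis's wavelet bank and DSP implementation:

```python
# One-level Haar wavelet transform: trends are pairwise averages, details are
# pairwise differences; discarding the details halves the data to transmit.
def haar_forward(x):
    trends = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    details = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    return trends, details

def haar_inverse(trends, details):
    out = []
    for t, d in zip(trends, details):
        out += [t + d, t - d]   # exact reconstruction when details are kept
    return out

x = [4, 6, 10, 12, 8, 6, 5, 5]
trends, details = haar_forward(x)

# Compression: transmit only the trends, i.e. zero the details at the receiver.
approx = haar_inverse(trends, [0] * len(trends))
print(trends)   # [5.0, 11.0, 7.0, 5.0]
print(approx)   # [5.0, 5.0, 11.0, 11.0, 7.0, 7.0, 5.0, 5.0]
```

The approximation is blocky because Haar is the crudest basis; the thesis's point is precisely that picking a better-matched wavelet from the basis library reduces this reconstruction error. For a 2-D image, the same transform is applied along rows and then columns.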


Many U.S. students do not perform well on mathematics assessments of algebra topics such as linear functions, a building block for other functions. The poor achievement of U.S. middle school students in this topic is a problem. U.S. eighth graders had average mathematics scores on international comparison tests such as the Third International Mathematics and Science Study, later known as the Trends in International Mathematics and Science Study (TIMSS), in 1995, 1999 and 2003, while Singapore students had the highest average scores. U.S. eighth grade average mathematics scores improved on TIMSS 2007 and held steady on TIMSS 2011. Results from PISA 2009 and 2012 and the National Assessment of Educational Progress of 2007, 2009, and 2013 showed a lack of proficiency in algebra. Results of curriculum studies involving nations in TIMSS suggest that both elementary and middle grades textbooks in high-scoring countries differed from U.S. textbooks with respect to general features. The purpose of this study was to compare treatments of linear functions in Singapore and U.S. middle grades mathematics textbooks. The results reveal features of current textbooks, and the findings should be valuable to constituencies who wish to improve U.S. mathematics achievement. Portions of eight Singapore and nine U.S. middle school student texts pertaining to linear functions were compared with respect to 22 features in three categories: (a) background features, (b) general features of problems, and (c) specific characterizations of problem practices, problem-solving competency types, and transfer of representation. Features were coded using a codebook developed by the researcher, and tallies and percentages were reported. Welch's t-tests and chi-square tests were used, respectively, to determine whether texts differed significantly on the features and whether codes were independent of country. U.S. and Singapore textbooks differed in page appearance and in numbers of pages, problems, and images. Texts were similar in problem appearance. Differences in problems related to assessment of conceptual learning: U.S. texts contained more problems requiring (a) use of definitions, (b) single computation, (c) interpreting, and (d) multiple responses. These differences may stem from cultural differences seen in attitudes toward education. Future studies should focus on page density, the spiral approach, and multiple-response problems.
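Welch's t-test, used in the study for the feature comparisons, can be sketched directly from its definition (the page counts below are invented and purely illustrative, not the study's data):

```python
import math
from statistics import mean, variance

# Welch's unequal-variances t statistic with Welch-Satterthwaite degrees of
# freedom; appropriate when the two groups differ in size and variance.
def welch_t(a, b):
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

us_pages = [812, 790, 845, 880, 799, 860, 905, 770, 838]   # hypothetical U.S. texts
sg_pages = [310, 295, 342, 280, 330, 305, 318, 299]        # hypothetical Singapore texts
t, df = welch_t(us_pages, sg_pages)
print(t > 2.0)   # a large t relative to the df flags a significant difference
```

Unlike Student's t-test, Welch's version does not pool the variances, which matters here because the nine U.S. and eight Singapore samples need not have equal spread.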


We study a small circuit of coupled nonlinear elements to investigate general features of signal transmission through networks; the small circuit itself is conceived as a building block for larger networks. Individual dynamics and coupling are motivated by neuronal systems: we consider two dynamical modes for an individual element, regular spiking and chattering, and each element can receive excitatory and/or inhibitory inputs and is subjected to different feedback types (excitatory and inhibitory; forward and recurrent). Both deterministic and stochastic simulations are carried out to study the input-output relationships of these networks. Major results for regular spiking elements include frequency locking, spike rate amplification for strong synaptic coupling, and inhibition-induced spike rate control, which can be interpreted as output frequency rectification. For chattering elements, spike rate amplification at low frequencies and silencing at large frequencies are characteristic.
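A minimal leaky integrate-and-fire sketch of one feed-forward excitatory pair (all parameters are illustrative, not the paper's element model) reproduces the flavor of the coupling-strength result: weak coupling fails to drive the downstream element, while strong coupling makes it follow the upstream rate.

```python
# Two leaky integrate-and-fire elements: element 0 is driven externally and
# sends an instantaneous excitatory kick to element 1 whenever it spikes.
def simulate(drive, weight, steps=5000, dt=0.1, threshold=1.0):
    v = [0.0, 0.0]                     # membrane potentials
    spikes = [0, 0]                    # spike counts
    for _ in range(steps):
        syn = 0.0
        v[0] += dt * (drive - v[0])    # leaky integration of the external drive
        if v[0] >= threshold:          # spike and reset
            v[0] = 0.0
            spikes[0] += 1
            syn = weight               # excitatory kick to element 1
        v[1] += dt * (-v[1]) + syn     # element 1: leak plus synaptic input
        if v[1] >= threshold:
            v[1] = 0.0
            spikes[1] += 1
    return spikes

weak = simulate(1.5, 0.3)
strong = simulate(1.5, 1.2)
print(weak[1], strong[1] > weak[1])   # 0 True: sub-threshold kicks decay away;
                                      # strong coupling transmits every spike
```

The full study adds inhibition, recurrence, chattering dynamics and stochastic inputs, but this threshold effect is the basic mechanism behind the reported spike-rate amplification at strong synaptic coupling.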