993 results for Texas


Relevance:

10.00%

Publisher:

Abstract:

A PMSM drive with high dynamic response is an attractive solution for servo applications such as robotics, machine tools, and electric vehicles. Vector control is a widely accepted control strategy for PMSM: it enables decoupled control of torque and flux, thus improving the transient response of torque and speed. Since vector control demands exhaustive real-time computation, the present work is implemented on a TI DSP TMS320C240. Position and speed controllers have been successfully tested. The feedback information used is the shaft (rotor) position from an incremental encoder and two motor currents. We conclude with the hope of extending the present experimental setup for further research related to PMSM applications.
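Vector (field-oriented) control hinges on the Clarke and Park transforms, which map the two measured phase currents into the rotating rotor (d-q) frame using the encoder angle. A minimal sketch of those two transforms (an illustration, not the paper's DSP implementation):

```python
import math

def clarke(ia, ib):
    """Clarke transform: two measured phase currents -> stationary alpha-beta frame
    (the third phase current follows from ia + ib + ic = 0)."""
    i_alpha = ia
    i_beta = (ia + 2.0 * ib) / math.sqrt(3.0)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    """Park transform: rotate alpha-beta currents into the rotor (d-q) frame using
    the rotor angle theta; id relates to flux, iq to torque."""
    i_d = i_alpha * math.cos(theta) + i_beta * math.sin(theta)
    i_q = -i_alpha * math.sin(theta) + i_beta * math.cos(theta)
    return i_d, i_q

# Example: with theta = 0 the d-q frame coincides with the alpha-beta frame.
print(park(*clarke(1.0, -0.5), theta=0.0))
```

With balanced currents ia = 1.0, ib = -0.5 and theta = 0, the d-axis current equals the alpha component and the q-axis current is zero, which is the decoupling the abstract refers to.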

Relevance:

10.00%

Publisher:

Abstract:

An educational kit was developed for power electronics and drives. The purpose of the kit is to train engineers in the current digital-control technology used in power electronics. A DSP is the natural choice, as it can perform the high-speed calculations power electronics requires. The kit consists of a DSP platform based on the TI TMS320C50 DSP starter kit, an inverter, and an induction machine-DC machine set. A set of experiments has been prepared so that DSP programming can be learned easily and smoothly. The application presented here is open-loop V/f control of a three-phase induction motor using the sine pulse width modulation (SPWM) technique.
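Open-loop V/f control keeps the voltage-to-frequency ratio constant, and sine PWM derives each phase's duty cycle from a sinusoidal reference. A hedged sketch of both steps (the ratings and values are illustrative, not taken from the kit):

```python
import math

V_RATED, F_RATED = 400.0, 50.0   # example machine ratings (assumed, not from the paper)

def vf_reference(f_cmd):
    """Constant V/f law: command voltage proportional to command frequency,
    capped at the rated voltage."""
    return min(V_RATED, V_RATED * f_cmd / F_RATED)

def spwm_duty(m, theta):
    """Sine PWM: duty cycle for one phase leg from modulation index m (0..1)
    and electrical angle theta, centred at 50%."""
    return 0.5 * (1.0 + m * math.sin(theta))

f_cmd = 25.0                       # run at half the rated frequency
m = vf_reference(f_cmd) / V_RATED  # modulation index = 0.5
print(spwm_duty(m, math.pi / 2))   # peak of the sine -> duty cycle 0.75
```

In the kit's setting, these duty cycles would be loaded into the DSP's PWM compare registers once per carrier period.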

Relevance:

10.00%

Publisher:

Abstract:

N-acetyl-β-D-glucosaminidase (NAGase) is a lysosomal enzyme belonging to the glycosidases that is released into milk during mastitis from damaged udder epithelial cells, neutrophils, and macrophages. Several studies have found NAGase activity to correlate with the inflammatory state of the udder and with the milk somatic cell count (SCC), and it has been proposed as a measure of udder epithelial cell damage, either alone or combined with SCC determination. Because precipitation does not interfere with the measurement of NAGase activity in milk, the activity does not change during milk storage, and the measurement is fairly simple and fast, the method would appear well suited as a screening test for subclinical mastitis. So far, the use of NAGase has been restricted to research. Its adoption is hampered by the large variation between studies in the NAGase activity reference values determined for healthy cows. According to several studies, NAGase activity in milk is higher when the inflammation is caused by a major pathogen than when a minor pathogen underlies it. The stage of lactation has been found to affect milk NAGase activity: activities are higher immediately after calving and in late lactation. It has also been observed that in normal milk, NAGase activity is slightly higher in strippings than in foremilk. Study results on the effect of parity on NAGase activity are contradictory. The aim of this study was to determine reference values for NAGase activity in the milk of healthy dairy cows and of cows with mastitis, and to investigate the effects of the severity of inflammation, the causative pathogen, parity, and the stage of lactation on the enzyme's activity in milk.
The study material comprised a total of 838 milk samples taken during 2000–2010 from 62 dairy farms in Finland and Estonia. The reference values for NAGase activity in normal milk were determined from 196 milk samples, collected from nine Finnish dairy farms, that met our criteria for normal milk. The criteria for normal milk were: SCC < 100,000, no clinical signs of mastitis, more than 30 days since calving, and more than 6 hours since the previous milking. NAGase activity was measured with a modified Mattila method (Mattila 1985) under standardized conditions. The data were analysed using Stata Intercooled version 11.0 (Stata Corporation, Texas, USA). Factors affecting milk NAGase activity in healthy quarters were studied with a linear mixed model with farm as the confounding factor. The correlation between SCC and NAGase activity was assessed in healthy cows, in cows with subclinical mastitis, and in the whole data set; correlations were calculated with Pearson's correlation coefficient. In all analyses, p < 0.05 was considered statistically significant. The reference range obtained for NAGase activity in normal milk from cows more than 30 days after calving was 0.09–1.04 pmol/min/μl of milk. Compared with the mean NAGase activity in normal milk (0.56) and the mean in cows with subclinical mastitis (2.49), the milk NAGase activity of cows with clinical mastitis was on average clearly higher (16.65). There was a clear difference in the means between cows with local signs (12.24) and cows with systemic signs (17.74). No correlation was observed between NAGase activity and SCC in milk samples from healthy quarters.
In subclinical mastitis, a positive correlation (0.74) was observed between milk NAGase activity and SCC. SCC, time since calving, and parity had statistically significant effects on NAGase activity. Regarding the pathogen groups, we observed that NAGase activity was clearly lower in quarters from which a minor pathogen was isolated than in quarters from which a major pathogen was isolated. The mean NAGase activity was 2.82 for minor pathogens (CNS, coryneforms) and 16.87 for major pathogens (S. aureus, Str. uberis, Str. agalactiae, Str. dysgalactiae, E. coli).
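The correlation analysis described in the abstract (Pearson's coefficient between SCC and NAGase activity) can be sketched as follows; the sample values below are hypothetical, not taken from the study data:

```python
from statistics import mean

def pearson(x, y):
    """Pearson's correlation coefficient for two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical paired observations: SCC (cells) and NAGase activity (pmol/min/ul).
scc = [60_000, 150_000, 400_000, 900_000, 2_000_000]
nagase = [0.4, 1.1, 2.0, 3.2, 5.5]
print(pearson(scc, nagase))  # strongly positive for this made-up sample
```

A statistical package such as Stata additionally reports the p-value of the coefficient, which is what the p < 0.05 threshold above is applied to.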

Relevance:

10.00%

Publisher:

Abstract:

We review work initiated and inspired by Sudarshan in relativistic dynamics, beam optics, partial coherence theory, Wigner distribution methods, multimode quantum optical squeezing, and geometric phases. The 1963 No Interaction Theorem, using Dirac's instant form and particle World Line Conditions, is recalled. Later attempts to overcome this result by exploiting constrained Hamiltonian theory, reformulating the World Line Conditions, and extending Dirac's formalism are reviewed. Dirac's front form leads to a formulation of Fourier Optics for the Maxwell field, determining the actions of First Order Systems (corresponding to matrices of Sp(2,R) and Sp(4,R)) on polarization in a consistent manner. These groups also help characterize the properties and propagation of partially coherent Gaussian Schell Model beams, leading to invariant quality parameters and the new Twist phase. The higher-dimensional groups Sp(2n,R) appear in the theory of Wigner distributions and in quantum optics. Elegant criteria for a Gaussian phase-space function to be a Wigner distribution, and expressions for multimode uncertainty principles and squeezing, are described. In geometric phase theory we highlight the use of invariance properties that lead to a kinematical formulation, and the important role of Bargmann invariants. Special features of these phases arising from unitary Lie group representations, and a new formulation based on the idea of Null Phase Curves, are presented.
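For reference, the single-mode Wigner distribution that the review builds on is, in one standard textbook convention (not necessarily the normalization used in the reviewed papers):

```latex
W(q,p) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty}
\psi^{*}(q+x)\,\psi(q-x)\,e^{2ipx/\hbar}\,dx ,
```

a real phase-space function whose marginals reproduce the position and momentum probability densities; the criteria mentioned above characterize which Gaussian phase-space functions can arise this way from a valid quantum state.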

Relevance:

10.00%

Publisher:

Abstract:

It is found that including higher-derivative terms in the gravitational action, together with the concepts of phase transition and spontaneous symmetry breaking, leads to some novel consequences. The Ricci scalar plays a dual role, as a physical field as well as a geometrical field. One obtains a Klein-Gordon equation for the emerging field, and the corresponding quanta of geometry are called Riccions. For the early universe, the model removes the singularity and provides inflation. In higher-dimensional gravity, Riccions can break into a spin-half particle and antiparticle, along with the breaking of left-right symmetry. The most tantalizing consequence is the emergence of the physical universe from geometry in the extreme past. Riccions can Bose-condense and may account for dark matter.
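As an illustration of how a Klein-Gordon equation for the Ricci scalar can arise (a standard $f(R) = R + \alpha R^2$ computation in one common sign convention, not necessarily the paper's conventions): the trace of the $f(R)$ field equations, $f'(R)R - 2f(R) + 3\Box f'(R) = \kappa T$, reduces in vacuum to

```latex
6\alpha\,\Box R - R = 0
\quad\Longrightarrow\quad
\Box R = m^{2} R, \qquad m^{2} = \frac{1}{6\alpha},
```

so the Ricci scalar propagates as a massive scalar field, and its quanta are the "Riccions" of the abstract.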

Relevance:

10.00%

Publisher:

Abstract:

In this paper we construct low-decoding-complexity STBCs by using the Pauli matrices as linear dispersion matrices. In this case the Hurwitz-Radon orthogonality condition is shown to be easily checked by transferring the problem to the $\mathbb{F}_4$ domain. The problem of constructing low-decoding-complexity STBCs is shown to be equivalent to finding certain codes over $\mathbb{F}_4$. It is shown that almost all known low-complexity STBCs can be obtained by this approach. New codes are given that have the least known decoding complexity in particular ranges of rate.
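The Hurwitz-Radon orthogonality condition for two dispersion matrices $A$ and $B$ is $AB^H + BA^H = 0$; since the Pauli matrices are Hermitian, for them it reduces to anticommutation. A quick numerical check of that condition (a sketch of the criterion only, not the paper's $\mathbb{F}_4$ machinery):

```python
import numpy as np

# Pauli matrices used as 2x2 linear dispersion matrices.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def hurwitz_radon_orthogonal(A, B):
    """Hurwitz-Radon condition A B^H + B A^H = 0, which decouples the
    corresponding symbols in the ML decoding metric."""
    return np.allclose(A @ B.conj().T + B @ A.conj().T, 0)

print(hurwitz_radon_orthogonal(X, Y))  # distinct Pauli matrices anticommute -> True
print(hurwitz_radon_orthogonal(X, X))  # X X^H + X X^H = 2I != 0 -> False
```

Pairs of dispersion matrices satisfying this condition allow their symbols to be decoded independently, which is the source of the reduced decoding complexity.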

Relevance:

10.00%

Publisher:

Abstract:

The paper presents an adaptive Fourier filtering technique, and a relaying scheme based on a combination of a digital band-pass filter and a three-sample algorithm, for applications in high-speed numerical distance protection. To enhance the performance of the above-mentioned technique, a high-speed fault detector has been used. MATLAB-based simulation studies show that the adaptive Fourier filtering technique provides fast tripping for near faults and security for farther faults. The digital relaying scheme based on a combination of a digital band-pass filter and a three-sample data-window algorithm also provides accurate, high-speed detection of faults. The paper also proposes a hardware scheme, based on a high-performance 16-bit fixed-point DSP (Texas Instruments TMS320LF2407A), suitable for implementing the above techniques. To evaluate the performance of the proposed relaying scheme under steady-state and transient conditions, PC-based menu-driven relay test procedures were developed using National Instruments LabVIEW software. The test signals are generated in real time using LabVIEW-compatible analog output modules. Results obtained from the simulation studies as well as from the hardware implementation are also presented.
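Fourier filtering in numerical relays typically estimates the fundamental-frequency phasor from one cycle of samples. A minimal full-cycle DFT sketch of that basic step (illustrative, not the paper's adaptive variant):

```python
import cmath
import math

def fundamental_phasor(samples):
    """Full-cycle DFT: extract the fundamental-frequency phasor from exactly one
    cycle of N equally spaced samples; rejects DC and integer harmonics."""
    n = len(samples)
    acc = sum(s * cmath.exp(-2j * math.pi * k / n) for k, s in enumerate(samples))
    return 2.0 * acc / n

# One cycle of a 10 A peak sine, sampled 16 times per cycle.
n = 16
wave = [10.0 * math.sin(2 * math.pi * k / n) for k in range(n)]
print(abs(fundamental_phasor(wave)))   # magnitude = 10.0
print(abs(fundamental_phasor([5.0] * n)))  # pure DC offset is rejected -> 0.0
```

The relay then computes impedance from the voltage and current phasors; the adaptive technique in the paper modifies this filtering to trade speed against security depending on fault location.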

Relevance:

10.00%

Publisher:

Abstract:

This research is designed to develop a new technique for site characterization in a three-dimensional domain. Site characterization is a fundamental task in geotechnical engineering practice, as well as a very challenging process, with the ultimate goal of estimating soil properties at any half-space subsurface point of a site based on limited tests. In this research, the sandy site at Texas A&M University's National Geotechnical Experimentation Site is selected as an example for developing the new technique, which is based on Artificial Neural Network (ANN) technology. A sequential approach is used to demonstrate the applicability of ANNs to site characterization. To verify its robustness, the proposed technique is compared with other commonly used approaches to site characterization. In addition, an artificial site is created in which soil property values at any half-space point are assumed, so that predicted values can be compared directly with their corresponding actual values as a means of validation. Since the three-dimensional model can estimate the soil property at any location in a site, it has many potential applications, especially where the soil properties within a zone, rather than at a single point, are of interest. Examples of soil properties of zonal interest include soil type classification and liquefaction potential evaluation. In this regard, the present study also addresses this type of application based on a site in Taiwan that experienced liquefaction during the 1999 Chi-Chi, Taiwan, Earthquake.
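The core idea of ANN-based site characterization is to fit a network mapping spatial coordinates to a measured soil property and then query it at untested points. A toy sketch with a hypothetical depth-only profile (the synthetic data and network size are assumptions, not the NGES measurements or the study's model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "CPT-like" training data: tip resistance rising with depth, plus noise.
depth = rng.uniform(0.0, 10.0, size=200)              # m
q_c = 2.0 + 1.5 * depth + rng.normal(0.0, 0.3, 200)   # MPa (hypothetical trend)

# Standardize inputs and outputs so plain gradient descent behaves well.
x = (depth.reshape(1, -1) - depth.mean()) / depth.std()
y_mean, y_std = q_c.mean(), q_c.std()
y = ((q_c - y_mean) / y_std).reshape(1, -1)

# One-hidden-layer network trained by gradient descent on (depth -> q_c).
W1 = rng.normal(0, 0.5, (8, 1)); b1 = np.zeros((8, 1))
W2 = rng.normal(0, 0.5, (1, 8)); b2 = np.zeros((1, 1))
lr = 0.1
for _ in range(3000):
    h = np.tanh(W1 @ x + b1)                   # hidden layer
    err = (W2 @ h + b2) - y                    # prediction error
    dW2 = err @ h.T / x.shape[1]; db2 = err.mean(axis=1, keepdims=True)
    dh = (W2.T @ err) * (1 - h ** 2)           # backprop through tanh
    dW1 = dh @ x.T / x.shape[1]; db1 = dh.mean(axis=1, keepdims=True)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1

def predict(d_m):
    """Estimate the soil property at an untested depth (the characterization step)."""
    xs = (np.array([[d_m]]) - depth.mean()) / depth.std()
    return float((W2 @ np.tanh(W1 @ xs + b1) + b2)[0, 0]) * y_std + y_mean

print(predict(5.0))  # should land near the underlying trend value 2.0 + 1.5*5 = 9.5
```

The study's three-dimensional version would take (x, y, z) coordinates as inputs instead of depth alone; the artificial-site validation described above corresponds to checking `predict` against the assumed ground-truth function.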

Relevance:

10.00%

Publisher:

Abstract:

The main objective of the study is to examine the accuracy of, and differences among, simulated streamflows driven by rainfall estimates from a network of 22 rain gauges spread over a 2,170 km² watershed, from NEXRAD Stage III radar data, and from Tropical Rainfall Measuring Mission (TRMM) 3B42 satellite data. The Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model, a physically based, distributed-parameter, grid-structured hydrologic model, was used to simulate the June 2002 flooding event in the Upper Guadalupe River watershed in south central Texas. There were significant differences between the rainfall fields estimated by the three measurement technologies, and these differences resulted in even larger differences in the simulated hydrologic response of the watershed. In general, simulations driven by radar rainfall yielded better results than those driven by satellite or rain-gauge estimates. This study also presents an overview of the effects of land cover changes on runoff and stream discharge. The results demonstrate that, for major rainfall events similar to the 2002 event, the urbanization of the watershed over the past two decades would not have had any significant effect on the hydrologic response. The effect of urbanization on the hydrologic response increases as the size of the rainfall event decreases.
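Differences between gauge, radar, and satellite rainfall fields like those described above are commonly summarized as bias and RMSE against a reference; a small sketch with hypothetical event totals (not values from the study):

```python
import math

def bias_and_rmse(estimate, reference):
    """Mean bias and root-mean-square error of one rainfall field against another."""
    diffs = [e - r for e, r in zip(estimate, reference)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical storm-total rainfall (mm) at five locations.
gauge = [120.0, 95.0, 140.0, 80.0, 110.0]   # taken as the reference here
radar = [110.0, 100.0, 150.0, 85.0, 105.0]
satellite = [90.0, 70.0, 100.0, 95.0, 85.0]

print(bias_and_rmse(radar, gauge))      # small bias, modest scatter
print(bias_and_rmse(satellite, gauge))  # systematic underestimation
```

Because a distributed model like GSSHA integrates rainfall nonlinearly over the grid, even modest input differences of this kind can amplify into the larger streamflow differences the study reports.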

Relevance:

10.00%

Publisher:

Abstract:

The paper focuses on reliability-based design of bridge abutments subjected to earthquake loading. A planar failure surface is used in conjunction with a pseudo-dynamic approach to compute the seismic active earth pressures on the bridge abutment. The proposed pseudo-dynamic method considers the effects of strain localization in the backfill soil and the associated post-peak reduction in shear resistance from peak to residual values along a previously formed failure plane, the phase difference in shear waves, and soil amplification, along with the horizontal seismic accelerations. Four modes of stability, viz. sliding, overturning, eccentricity, and bearing capacity of the foundation soil, are considered in the reliability analysis. The influence of various design parameters on the seismic reliability indices against the four failure modes is presented, following the suggestions of the Japan Road Association, the Caltrans Bridge Design Specifications, and the U.S. Department of the Army.
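A reliability index for a single failure mode can be illustrated with the classical first-order expression β = (μ_R − μ_S) / √(σ_R² + σ_S²) for independent normal resistance R and load effect S (a textbook sketch with hypothetical moments, not the paper's four-mode pseudo-dynamic analysis):

```python
import math
from statistics import NormalDist

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order reliability index for independent normal resistance and load."""
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

# Hypothetical sliding mode: resistance and seismic demand statistics (kN).
beta = reliability_index(mu_r=900.0, sigma_r=90.0, mu_s=600.0, sigma_s=120.0)
pf = NormalDist().cdf(-beta)  # corresponding notional probability of failure
print(beta)  # 2.0 for these made-up numbers
print(pf)
```

In the paper's setting, a β of this kind is computed separately for sliding, overturning, eccentricity, and bearing capacity, and the smallest index governs the design.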

Relevance:

10.00%

Publisher:

Abstract:

Each new generation of GPUs vastly increases the resources available to GPGPU programs. GPU programming models (like CUDA) were designed to scale to use these resources. However, we find that CUDA programs do not actually scale to utilize all available resources, with over 30% of resources going unused on average for the Parboil2 suite programs used in our work. Current GPUs therefore allow concurrent execution of kernels to improve utilization. In this work, we study concurrent execution of GPU kernels using multiprogram workloads on current NVIDIA Fermi GPUs. On two-program workloads from the Parboil2 benchmark suite, we find that concurrent execution is often no better than serialized execution. We identify the lack of control over resource allocation to kernels as a major serialization bottleneck. We propose transformations that convert CUDA kernels into elastic kernels, which permit fine-grained control over their resource usage. We then propose several elastic-kernel-aware concurrency policies that offer significantly better performance and concurrency than the current CUDA policy. We evaluate our proposals on real hardware using multiprogrammed workloads constructed from benchmarks in the Parboil2 suite. On average, our proposals increase system throughput (STP) by 1.21x and improve the average normalized turnaround time (ANTT) by 3.73x for two-program workloads, compared to the current CUDA concurrency implementation.
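STP and ANTT are standard multiprogramming metrics, computed from each program's isolated runtime and its runtime in the shared (concurrent) run. A hedged sketch of how they are evaluated (the runtimes below are illustrative, not Parboil2 measurements):

```python
def stp(alone, shared):
    """System throughput: sum of per-program normalized progress rates.
    Values above 1 mean co-running beats dedicating the GPU to one program."""
    return sum(a / s for a, s in zip(alone, shared))

def antt(alone, shared):
    """Average normalized turnaround time: mean per-program slowdown.
    1.0 would mean no slowdown from sharing; lower is better."""
    return sum(s / a for a, s in zip(alone, shared)) / len(alone)

# Hypothetical two-program workload (seconds): isolated vs concurrent runtimes.
alone = [10.0, 20.0]
shared = [14.0, 25.0]
print(stp(alone, shared))   # 10/14 + 20/25 ~ 1.51
print(antt(alone, shared))  # (1.4 + 1.25) / 2 = 1.325
```

The reported 1.21x STP gain and 3.73x ANTT improvement are ratios of these metrics under the elastic-kernel policies versus the default CUDA concurrency policy.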