868 results for end user modes of operation
Abstract:
Key message: Log-end splitting is one of the most important defects in veneer logs. We show that log-end splitting in the temperate plantation species Eucalyptus nitens varies across sites and within-tree log position and increases with time in storage. Context: Log-end splitting is one of the most important defects in veneer logs because it can substantially reduce the recovery of veneer sheets. Eucalyptus nitens can develop log-end splits, but the factors affecting log-end splitting in this species are not well understood. Aims: The present study aims to describe the effect of log storage and steaming on the development of log-end splitting in logs from different plantations and log positions within the tree. Methods: The study was conducted on upper and lower logs from each of 41 trees from three 20–22-year-old Tasmanian E. nitens plantations. Log-end splitting was assessed immediately after felling, after transport and storage in a log-yard, and just before peeling. A pre-peeling steam treatment was applied to half the logs. Results: Site had a significant effect on splitting, and upper logs split more than lower logs with storage. Splitting increased with tree diameter at breast height (DBH), but this relationship varied with site. The most rapidly growing site had more splitting even after accounting for DBH. No significant effect of steaming was detected. Conclusion: Log-end splitting varied across sites and within-tree log position and increased with time in storage.
Abstract:
In this thesis we study a series of multi-user resource-sharing problems for the Internet, which involve distributing a common resource among the participants of multi-user systems (servers or networks). We study concurrently accessible resources, which may be either exclusively or non-exclusively accessible to end users. For each kind we suggest a separate algorithm or a modification of a common reputation scheme. Every algorithm or method is studied from several perspectives: optimality of the protocol, selfishness of end users, and fairness of the protocol towards end users. On the one hand, this multifaceted analysis allows us to select the best-suited protocol from a set of available ones based on trade-offs among the optimality criteria. On the other hand, predictions about the future Internet dictate new optimality rules that we should take into account, and new properties of networks that can no longer be neglected. In this thesis we have studied new protocols for such resource-sharing problems as the backoff protocol, defense mechanisms against Denial-of-Service attacks, and fairness and confidentiality for users in overlay networks. For the backoff protocol we present an analysis of a general backoff scheme, in which an optimization is applied to a general backoff function. It leads to an optimality condition for backoff protocols in both slotted-time and continuous-time models. Additionally, we present an extension of the backoff scheme that achieves fairness for the participants in an unfair environment, such as one with unequal wireless signal strengths. Finally, for the backoff algorithm we suggest a reputation scheme that deals with misbehaving nodes. For the next problem, denial-of-service attacks, we suggest two schemes that deal with malicious behavior under two conditions: forged identities and unspoofed identities.
For the former we suggest a novel most-knocked-first-served algorithm, while for the latter we apply a reputation mechanism in order to restrict resource access for misbehaving nodes. Finally, we study a reputation scheme for overlay and peer-to-peer networks, where the resource is not located at a common station but is spread across the network. The theoretical analysis suggests what behavior an end station will select under such a reputation mechanism.
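The general backoff scheme discussed above can be illustrated with a minimal sketch. The binary exponential window and the cap value here are illustrative assumptions, not the thesis's optimized backoff function:

```python
import random

def backoff_slots(attempt, cap=1024):
    """Binary exponential backoff: after `attempt` failed attempts,
    wait a uniformly random number of slots drawn from
    [0, 2**attempt - 1], with the window capped at `cap` slots."""
    window = min(2 ** attempt, cap)
    return random.randrange(window)
```

On the first attempt the window is a single slot, so the sender retries immediately; repeated collisions widen the window geometrically, spreading contending stations out in time.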
Abstract:
Wireless access is expected to play a crucial role in the future of the Internet. The demands of the wireless environment are not always compatible with the assumptions that were made in the era of wired links. At the same time, new services that take advantage of advances in many areas of technology are being invented. These services include the delivery of mass media such as television and radio, Internet phone calls, and video conferencing. The network must be able to deliver these services to the end user with acceptable performance and quality. This thesis presents an experimental study measuring the performance of bulk-data TCP transfers, streaming audio flows, and HTTP transfers that compete for the limited bandwidth of a GPRS/UMTS-like wireless link. The wireless link characteristics are modeled with a wireless network emulator. We analyze how different competing workload types behave with regular TCP and how active queue management, Differentiated Services (DiffServ), and a combination of TCP enhancements affect performance and quality of service. We test four link types, including an error-free link and links with different levels of Automatic Repeat reQuest (ARQ) persistency. The analysis consists of comparing the resulting performance of the different configurations based on defined metrics. We observed that DiffServ and Random Early Detection (RED) with Explicit Congestion Notification (ECN) are useful, and in some conditions necessary, for quality of service and fairness, because without DiffServ and RED a long queuing delay and congestion-related packet losses cause problems. However, we observed situations where there is still room for significant improvement if the link level is aware of the quality of service. Only a very error-prone link diminishes the benefits to nil. A combination of TCP enhancements, namely an initial window of four, Control Block Interdependence (CBI), and Forward RTO recovery (F-RTO), improves performance.
The initial window of four helps a later-starting TCP flow to start faster but generates congestion under some conditions. CBI prevents slow-start overshoot and balances slow start in the presence of error drops, and F-RTO successfully reduces unnecessary retransmissions.
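The RED-with-ECN behavior studied above can be sketched as a marking-probability function. The threshold and probability values here are arbitrary illustrative defaults, not those used in the thesis's experiments:

```python
def red_mark_probability(avg_queue, min_th=5.0, max_th=15.0, max_p=0.1):
    """Random Early Detection: probability of marking (for
    ECN-capable flows) or dropping a packet, as a function of the
    averaged queue length."""
    if avg_queue < min_th:
        return 0.0          # below the lower threshold: never mark
    if avg_queue >= max_th:
        return 1.0          # above the upper threshold: always mark/drop
    # linear ramp between the two thresholds
    return max_p * (avg_queue - min_th) / (max_th - min_th)
```

By marking early and probabilistically, RED signals congestion before the queue overflows, which is what shortens queuing delay and avoids the congestion-related losses noted above.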
Abstract:
In my doctoral thesis I examine the economics of information goods and copyright from two different perspectives. The first of these belongs to the field of endogenous growth theory. In the thesis I generalize a "pool of knowledge" type endogenous growth model to a situation in which a patentable innovation has a minimum size, and in which a firm that has patented a new kind of product can lose its monopoly on that product through imitation. Within the context of the model, the effects of imitation and of the required "minimum size" of innovations on welfare and economic growth can be analyzed. The growth-maximizing amount of imitation in the model is always zero, but the welfare-maximizing amount of imitation can be positive. The "minimum size" of a patentable innovation that maximizes economic growth and welfare can take any value smaller than the theoretical maximum. In the two latter main chapters of the thesis I examine commercial piracy of information goods using a microeconomic model. The production costs of illegally made copies of information goods are small, and almost nonexistent when the copies are distributed, for example, over the Internet. Because pirate copies have many different producers, microeconomic theory would lead one to expect their price to fall almost to zero, and if this happened, commercial piracy would be impossible. In my model I explain the existence of commercial piracy by assuming that the threat of punishment for piracy depends on how many consumers the pirate offers illegal goods to, and that it therefore affects the market for pirate copies in the manner of an advertising cost. In my model, increasing the fixed costs of commercial pirates is always in the interest of the copyright holder, but increasing the "advertising cost" is not necessarily so; it may also reduce the profits obtained from sales of legal copies.
This result differs from corresponding earlier results in that it holds even if the information goods in question involve no network effects. Earlier models of non-commercial piracy have often yielded the result that illegal copies of an information good can increase the profits from sales of legal copies if the value of legal copies to their users depends on how many other consumers use a similar good, and if the availability of pirate copies sufficiently increases the value of the legal copies. In the final main chapter of the thesis I generalize my model to network industries and use the generalized model to investigate in which cases the corresponding result also holds for commercial piracy.
Abstract:
The study analyzes the effort to build political legitimacy in the Republic of Turkey by exploring a group of influential texts produced by Kemalist writers. The study explores how the Kemalist regime reproduced a certain long-lasting enlightenment meta-narrative in its effort to build political legitimacy. Central to this process was a hegemonic representation of history, namely the interpretation of the Anatolian Resistance Struggle of 1919–1922 as a Turkish Revolution executing the enlightenment in the Turkish nation-state. The method employed in the study is contextualizing narratological analysis. The Kemalist texts are analyzed with a repertoire of concepts originally developed in the theory of narrative. By bringing these concepts together with the epistemological foundations of the historical sciences, the study creates a theoretical frame within which it is possible to highlight how initially very controversial historical representations in the end manage to construct long-lasting, emotionally and intellectually convincing bases of national identity for the secular middle classes in Turkey. The two most important explanatory concepts in this sense are diegesis and implied reader. Diegesis refers to the ability of narrative representation to create an inherently credible story-world that works as the basis of national community. The implied reader refers to the process whereby a certain hegemonic narrative creates a formula of identification and a position through which any individual real-world reader of a story can step inside the narrative story-world and identify oneself as one of "us" of the national narrative. The study demonstrates that the Kemalist enlightenment meta-narrative created a group of narrative accruals which enabled generations of secular middle classes to internalize Kemalist ideology.
In this sense, the narrative in question has not only worked as a tool utilized by the so-called Kemalist state elite to justify its leadership, but has been internalized by various groups in Turkey, working as their genuine world-view. It is shown in the study that secularism must be seen as the core ingredient of these groups' national identity. The study proposes that the enlightenment narrative reproduced in Kemalist ideology had its origin in a similar totalizing cultural narrative created in and for Europe. Currently this enlightenment project is challenged in Turkey by those who attempt to give religion a greater role in Turkish society. The study argues that the enduring practice of legitimizing political power through the enlightenment meta-narrative has not only become a major factor contributing to social polarization in Turkey, but has also, in contradiction to the very real potential for critical approaches inherent in the Enlightenment tradition, crucially restricted the development of critical and rational modes of thinking in the Republic of Turkey.
Abstract:
The open development model of software production has been characterized as the future model of knowledge production and distributed work. The open development model refers to publicly available source code, ensured by an open source license, and to the extensive and varied distributed participation of volunteers enabled by the Internet. Contemporary spokesmen of open source communities and academics view open source development as a new form of volunteer work activity characterized by a "hacker ethic" and "bazaar governance". The development of the Linux operating system is perhaps the best known example of such an open source project. It started as an effort by a user-developer and grew quickly into a large project with hundreds of user-developers as contributors. However, in "hybrids", in which firms participate in open source projects oriented towards end users, it seems that most users do not write code. The OpenOffice.org project, initiated by Sun Microsystems, represents such a project in this study. In addition, Finnish public-sector ICT decision-making concerning open source use is studied. The purpose is to explore the assumptions, theories and myths related to the open development model by analysing the discursive construction of the OpenOffice.org community: its developers, users and management. The qualitative study aims at shedding light on the dynamics and challenges of community construction and maintenance, and the related power relations in hybrid open source, by asking two main research questions: How are the structure and membership constellation of the community, specifically the relation between developers and users, linguistically constructed in hybrid open development? What characterizes Internet-mediated virtual communities, and how can they be defined? How do they differ from hierarchical forms of knowledge production on the one hand and from traditional volunteer communities on the other?
The study utilizes sociological, psychological and anthropological concepts of community for understanding the connection between the real and the imaginary in so-called virtual open source communities. Intermediary methodological and analytical concepts are borrowed from discourse and rhetorical theories. A discursive-rhetorical approach is offered as a methodological toolkit for studying texts and writing in Internet communities. The empirical chapters approach the problem of community and its membership from four complementary points of view. The data comprise mailing-list discussions, personal interviews, web-page writings, email exchanges, field notes and other historical documents. The four viewpoints are: 1) the community as conceived by volunteers; 2) the individual contributor's attachment to the project; 3) public-sector organizations as users of open source; 4) the community as articulated by the community manager. I arrive at four conclusions concerning my empirical studies (1-4) and two general conclusions (5-6). 1) Sun Microsystems and OpenOffice.org Groupware volunteers failed to develop the necessary and sufficient open code and open dialogue to ensure collaboration, thus splitting the Groupware community into the volunteers ("we") and the firm ("them"). 2) Instead of separating intrinsic and extrinsic motivations, I find that volunteers' unique patterns of motivation are tied to changing objects and personal histories prior to and during participation in the OpenOffice.org Lingucomponent project. Rather than seeing volunteers as a unified community, they can be better understood as independent entrepreneurs in search of a "collaborative community". The boundaries between work and hobby are blurred and shifting, thus questioning the usefulness of the concept of "volunteer".
3) The public-sector ICT discourse portrays a dilemma and tension between the freedom to choose, use and develop one's desktop in the spirit of open source on the one hand, and the striving for better desktop control and maintenance by IT staff and user advocates on the other. The link between the global OpenOffice.org community and local end-user practices is weak and mediated by the problematic relationship between IT staff and end users. 4) "Authoring community" can be seen as a new, hybrid open source community-type of managerial practice. The ambiguous concept of community is a powerful strategic tool for orienting towards multiple real and imaginary audiences, as evidenced in the global membership rhetoric. 5) The changing and contradictory discourses of this study show a change in the conceptual system and developer-user relationship of the open development model. This change is characterized as a movement from "hacker ethic" and "bazaar governance" to a more professionally and strategically regulated community. 6) The community is simultaneously real and imagined, and can be characterized as a "runaway community". Discursive action can be seen as a specific type of online open source engagement. Hierarchies and structures are created through discursive acts. Key words: Open Source Software, open development model, community, motivation, discourse, rhetoric, developer, user, end-user
Abstract:
The temperature dependence of the intra-molecular vibrational modes of C-60 in the quasi-1D polymeric RbC60, across the low-temperature transition at ~50 K, has been probed through infrared (IR) and Raman spectroscopies. With the lowering of temperature, the split IR modes of RbC60 are seen to harden, but below 50 K a small but definitive signature of an anomalous softening is observed. In addition, the background IR transmission shows an increase below 50 K with the opening of a well-defined gap in the electronic spectrum. The implications of these results, along with those of the Raman measurements, are discussed in terms of the interaction of intra-molecular phonons with electrons and spin excitations in the system. (C) 2002 Published by Elsevier Science Ltd.
Abstract:
Knowledge of protein-ligand interactions is essential for understanding several biological processes and important for applications ranging from understanding protein function to drug discovery and protein engineering. Here, we describe an algorithm for the comparison of three-dimensional ligand-binding sites in protein structures. A previously described algorithm, PocketMatch (version 1.0), is optimised, expanded, and MPI-enabled for parallel execution. PocketMatch (version 2.0) rapidly quantifies binding-site similarity based on structural descriptors such as residue nature and interatomic distances. Atomic-scale alignments may also be obtained from the amino acid residue pairings generated. It allows an end user to compute database-wide, all-to-all comparisons in a matter of hours. A demonstration of the algorithm on a sample dataset, a performance analysis, and annotated source code are also included.
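The general idea of a distance-based binding-site descriptor can be sketched in toy form. This is an illustration of the approach (sorted interatomic distances grouped by residue-type pair, compared with a tolerance), not PocketMatch's actual descriptor or scoring scheme; the residue names and tolerance are illustrative:

```python
from itertools import combinations
import math

def distance_signature(site):
    """site: list of (residue_type, (x, y, z)) tuples. Return sorted
    pairwise distances grouped by residue-type pair, a crude
    structural descriptor of the binding site."""
    sig = {}
    for (t1, p1), (t2, p2) in combinations(site, 2):
        key = tuple(sorted((t1, t2)))
        sig.setdefault(key, []).append(math.dist(p1, p2))
    return {k: sorted(v) for k, v in sig.items()}

def similarity(sig_a, sig_b, tol=0.5):
    """Fraction of distances in sig_a greedily matched (within tol)
    by an unused distance of the same residue-type pair in sig_b."""
    matched = total = 0
    for key, dists_a in sig_a.items():
        total += len(dists_a)
        dists_b = list(sig_b.get(key, []))
        for d in dists_a:
            for i, e in enumerate(dists_b):
                if abs(d - e) <= tol:
                    matched += 1
                    del dists_b[i]   # each distance matched at most once
                    break
    return matched / total if total else 0.0
```

A site compared against itself scores 1.0, and the score degrades as residues are removed or displaced, which is the qualitative behavior a binding-site comparison needs.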
Abstract:
The average time tau_r for one end of a long, self-avoiding polymer to interact for the first time with a flat penetrable surface to which it is attached at the other end is shown here to scale essentially as the square of the chain's contour length N. This result is obtained within the framework of the Wilemski-Fixman approximation to diffusion-limited reactions, in which the reaction time is expressed as a time correlation function of a "sink" term. In the present work, this sink-sink correlation function is calculated using perturbation expansions in the excluded volume and the polymer-surface interactions, with renormalization group methods being used to resum the expansion into a power-law form. The quadratic dependence of tau_r on N mirrors the behavior of the average time tau_c of a free random walk to cyclize, but contrasts with the cyclization time of a free self-avoiding walk (SAW), for which tau_r ~ N^2.2. A simulation study by Cheng and Makarov [J. Phys. Chem. B 114, 3321 (2010)] of the chain-end reaction time of an SAW on a flat impenetrable surface leads to the same N^2.2 behavior, which is surprising given the reduced conformational space a tethered polymer has to explore in order to react. (C) 2014 AIP Publishing LLC.
Abstract:
The paper presents a study of wave propagation in quasicrystals. Our interest is in the computation of the wavenumber (k_n) and group speed (c_g) of the phonon and phason displacement modes of one-, two-, and three-dimensional quasicrystals. These wave-parameter expressions are derived and computed using the elasto-hydrodynamic equations for quasicrystals. For the computation of the wavenumbers and group speeds, we use a Fourier-transform approximation of the phonon and phason displacement modes. The characteristic equations obtained are polynomial equations in the wavenumber k_n, with frequency as a parameter. The corresponding group speeds c_g for different frequencies are then computed from the wavenumber k_n. The variation of wavenumber and group speed with frequency is plotted for a 1-D quasicrystal, 2-D decagonal Al-Ni-Co quasicrystals, and 3-D icosahedral Al-Pd-Mn and Zn-Mg-Sc quasicrystals. From the wavenumber and group-speed plots, we obtain the cut-off frequencies for different spatial wavenumbers eta_m. The results show that for 1-D, 2-D, and 3-D quasicrystals, the phonon displacement modes are non-dispersive for low values of eta_m and become dispersive for increasing values of eta_m. Cut-off frequencies are not observed for very low values of eta_m, whereas a cut-off frequency starts to appear with increasing eta_m. The group speeds of the phason displacement modes are orders of magnitude lower than those of the phonon displacement modes, showing that the phason modes do not propagate; they are essentially diffusive modes. The group speeds of the phason modes are also not influenced by eta_m. The group speeds for the 2-D quasicrystal at 35 kHz are also simulated numerically using Galerkin spectral finite element methods in the frequency domain and compared with the results obtained using wave propagation analysis.
The effect of the phonon and phason elastic constants on the group speeds is studied using 3-D icosahedral Al-Pd-Mn and Zn-Mg-Sc quasicrystals. It is also shown that the phason elastic constants and the coupling coefficient do not affect the group speeds of the phonon displacement modes. (C) 2015 AIP Publishing LLC.
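The relation underlying the group-speed computation above, c_g = dω/dk, can be checked numerically on any dispersion relation ω(k). The linear and quadratic relations in the example are stand-ins, not a quasicrystal's actual characteristic equation:

```python
def group_speed(omega, k, dk=1e-6):
    """Group speed c_g = d(omega)/dk, estimated with a central
    finite difference on a dispersion relation omega(k)."""
    return (omega(k + dk) - omega(k - dk)) / (2.0 * dk)
```

For a non-dispersive relation omega = c * k, the group speed equals the constant phase speed c at every k, which matches the low-eta_m phonon behavior described above; for a dispersive relation the group speed varies with k.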
Abstract:
This paper proposes a technique to suppress low-order harmonics for an open-end winding induction motor drive for a full modulation range. One side of the machine is connected to a main inverter with a dc power supply, whereas the other inverter is connected to a capacitor from the other side. Harmonic suppression (with complete elimination of fifth- and seventh-order harmonics) is achieved by realizing dodecagonal space vectors using a combined pulsewidth modulation (PWM) control for the two inverters. The floating capacitor voltage is inherently controlled during the PWM operation. The proposed PWM technique is shown to be valid for the entire modulation range, including overmodulation and six-step mode of operation of the main inverter. Experimental results have been presented to validate the proposed technique.
Abstract:
In this paper, we try to establish the equivalence or similarity of the thermal and physicochemical changes in precursor droplets (cerium nitrate) in convective and radiative fields. The radiative field is created through careful heating of the droplet using a monochromatic light source (a CO2 laser). The equivalence is also established for different modes of convection, such as a droplet injected into a high-speed flow and a droplet experiencing a convective flow due to acoustic streaming (levitated) only. The thermophysical changes are studied in an aqueous cerium nitrate droplet, and the dissociation of cerium nitrate to ceria is modeled using a modified Kramers' reaction rate formulation. It is observed that the vaporization, species accumulation, and chemical characteristics obtained in a convectively heated droplet are retained in a radiatively heated droplet by careful adjustment of the laser intensity. The timescales and ceria yield match reasonably well for both cases. It is also noted that similar conclusions hold for both levitated and nonlevitated droplets.
Abstract:
A three-day workshop on turbidity measurements was held at the Hawaii Institute of Marine Biology from August 31 to September 2, 2005. The workshop was attended by 30 participants from industry, coastal management agencies, and academic institutions. All groups recognized common issues regarding the definition of turbidity, the limitations of consistent calibration, and the large variety of instrumentation that nominally measures "turbidity." The major recommendations, in order of importance for the coastal monitoring community, are listed below: 1. The community of users in coastal ecosystems should tighten instrument design configurations to minimize inter-instrument variability, choosing a set of specifications that are best suited for coastal waters. The ISO 7027 design standard is not tight enough. Advice on these design criteria should be solicited through the ASTM as well as Federal and State regulatory agencies representing the majority of turbidity sensor end users. Parties interested in making turbidity measurements in coastal waters should develop design specifications for these water types rather than relying on design standards made for the analysis of drinking water. 2. The coastal observing groups should assemble a community database relating the output of specific sensors to different environmental parameters, so that the entire community of users can benefit from shared information. This would include an unbiased, parallel study of different turbidity sensors, employing a variety of designs and configurations in the broadest range of coastal environments. 3. Turbidity should be used as a measure of relative change in water quality rather than an absolute measure of water quality. Thus, this is a recommendation for managers to develop their own local calibrations; see the next recommendation. 4.
If the end user specifically wants to use a turbidity sensor to measure a specific water quality parameter such as suspended particle concentration, then direct measurement of that water quality parameter is necessary to correlate with "turbidity" for a particular environment. These correlations, however, will be specific to the environment in which they are measured. This works because there are many environments in which water composition is relatively stable but varies in magnitude or concentration. (pdf contains 22 pages)
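The site-specific correlation in recommendation 4 amounts to fitting a local calibration curve between sensor output and an independently measured parameter. A least-squares line is one simple way to do this; the variable names and units are illustrative, not from the workshop report:

```python
def fit_calibration(turbidity, concentration):
    """Ordinary least-squares fit concentration = a * turbidity + b,
    a site-specific calibration between sensor output (e.g. NTU)
    and a directly measured parameter (e.g. mg/L suspended solids)."""
    n = len(turbidity)
    mx = sum(turbidity) / n
    my = sum(concentration) / n
    sxx = sum((x - mx) ** 2 for x in turbidity)
    sxy = sum((x - mx) * (y - my) for x, y in zip(turbidity, concentration))
    a = sxy / sxx
    b = my - a * mx
    return a, b
```

Because the fitted coefficients absorb the local particle composition, a calibration fitted at one site should not be reused at another, which is exactly the caveat the recommendation states.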
Abstract:
A series of experiments was conducted on the use of a device to passively generate vortex rings, henceforth a passive vortex generator (PVG). The device is intended as a means of propulsion for underwater vehicles, as the use of vortex rings has been shown to decrease the fuel consumption of a vehicle by up to 40% (Ruiz, 2010).
The PVG was constructed out of a collapsible tube encased in a rigid, airtight box. By adjusting the pressure within the airtight box while fluid was flowing through the tube, it was possible to create a pulsed jet with vortex rings via self-excited oscillations of the collapsible tube.
A study of PVG integration into an existing autonomous underwater vehicle (AUV) system was conducted. A small AUV was used to retrofit a PVG with limited alterations to the original vehicle. The PVG-integrated AUV was used for self-propelled testing to measure the hydrodynamic (Froude) efficiency of the system. The results show that the PVG-integrated AUV had a 22% increase in the Froude efficiency using a pulsed jet over a steady jet. The maximum increase in the Froude efficiency was realized when the formation time of the pulsed jet, a nondimensional time to characterize vortex ring formation, was coincident with vortex ring pinch-off. This is consistent with previous studies that indicate that the maximization of efficiency for a pulsed jet vehicle is realized when the formation of vortex rings maximizes the vortex ring energy and size.
The other study was a parameter study of the physical dimensions of a PVG. This study was conducted to determine the effect of the tube diameter and length on the oscillation characteristics such as the frequency. By changing the tube diameter and length by factors of 3, the frequency of self-excited oscillations was found to scale as f~D_0^{-1/2} L_0^0, where D_0 is the tube diameter and L_0 the tube length. The mechanism of operation is suggested to rely on traveling waves between the tube throat and the end of the tube. A model based on this mechanism yields oscillation frequencies that are within the range observed by the experiment.
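The empirical scaling f ~ D_0^{-1/2} L_0^0 found above can be applied as a simple prediction rule. This sketch only restates that scaling; the reference values in the example are illustrative numbers, not measurements from the study:

```python
def predicted_frequency(f_ref, d_ref, d_new):
    """Given a reference oscillation frequency f_ref measured at tube
    diameter d_ref, predict the frequency at diameter d_new using
    f ~ D**-0.5; tube length drops out (exponent zero)."""
    return f_ref * (d_ref / d_new) ** 0.5
```

Quadrupling the diameter should therefore halve the oscillation frequency, while changing the tube length should leave it unchanged.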
Abstract:
Bulk n-InSb is investigated as a heterodyne detector for the submillimeter wavelength region. Two modes of operation are investigated: (1) the Rollin or hot-electron bolometer mode (zero magnetic field), and (2) the Putley mode (quantizing magnetic field). The highlight of the thesis work is the pioneering demonstration of the Putley mode mixer at several frequencies. For example, a double-sideband system noise temperature of about 510 K was obtained using an 812 GHz methanol laser as the local oscillator. This performance is at least a factor of 10 more sensitive than any other performance reported to date at the same frequency. In addition, the Putley mode mixer achieved system noise temperatures of 250 K at 492 GHz and 350 K at 625 GHz. The 492 GHz performance is about 50% better, and the 625 GHz performance about 100% better, than the previous best performances established by the Rollin-mode mixer. To achieve these results, it was necessary to design a totally new ultra-low-noise, room-temperature preamp to handle the higher source impedance imposed by Putley mode operation. This preamp has considerably less input capacitance than comparably noisy ambient designs.
In addition to advancing receiver technology, this thesis also presents several novel results regarding the physics of n-InSb at low temperatures. A Fourier transform spectrometer was constructed and used to measure the submillimeter-wave absorption coefficient of relatively pure material at liquid helium temperatures and in zero magnetic field. Below 4.2 K, the absorption coefficient was found to decrease with frequency much faster than predicted by Drudian theory. Much better agreement with experiment was obtained using a quantum theory based on inverse bremsstrahlung in a solid. Also, the noise of the Rollin-mode detector at 4.2 K was accurately measured and compared with theory. The power spectrum is found to be well fit by a recent theory of non-equilibrium noise due to Mather. Surprisingly, when biased for optimum detector performance, high-purity InSb cooled to liquid helium temperatures generates less noise than that predicted by simple non-equilibrium Johnson noise theory alone. This explains in part the excellent performance of the Rollin-mode detector in the millimeter wavelength region.
Again using the Fourier transform spectrometer, spectra were obtained of the responsivity and direct-detection NEP as a function of magnetic field in the range 20-110 cm^-1. The results show a discernible peak in the detector response at the conduction-electron cyclotron resonance frequency for magnetic fields as low as 3 kG at bath temperatures of 2.0 K. The spectra also display the well-known peak due to the cyclotron resonance of electrons bound to impurity states. The magnitude of the responsivity at both peaks is roughly constant with magnetic field and is comparable to the low-frequency Rollin-mode response. The NEP at the peaks is found to be much better than previous values at the same frequency and comparable to the best long-wavelength results previously reported. For example, a value NEP = 4.5x10^-13 W/Hz^1/2 is measured at 4.2 K, 6 kG, and 40 cm^-1. Study of the responsivity under conditions of impact ionization showed a dramatic disappearance of the impurity-electron resonance while the conduction-electron resonance remained constant. This observation offers the first concrete evidence that the mobility of an electron in the N=0 and N=1 Landau levels is different. Finally, these direct-detection experiments indicate that the excellent heterodyne performance achieved at 812 GHz should be attainable up to frequencies of at least 1200 GHz.