978 results for feedlot receiving


Relevance:

10.00%

Publisher:

Abstract:

Vancouver Lake, located adjacent to the Columbia River just north of the Vancouver-Portland metropolitan area, is a "dying" lake. Although all lakes die naturally in geologic time through the process of eutrophication, Vancouver Lake is dying more rapidly because of man's activities and the resultant increased accumulation of sediment, chemicals, and wastes. Natural eutrophication takes thousands of years, whereas man-made modifications can cause the death of a lake in decades. Vancouver Lake does, however, have the potential of becoming a valuable water resource asset for the area, particularly because of its location near the Columbia River, which can be used as a source of "flushing" water to improve the quality of Vancouver Lake.

Community interest in Vancouver Lake has waxed and waned. Prior to World War II, there were relatively few plans for, or discussions about, the Lake and its surrounding land area. A plan to drain the Lake for farming was prohibited by the city council and county commissioners. Interest increased in 1945 when the federal government considered developing the Lake as a berthing harbor for deactivated ships, at which time a preliminary proposal was prepared by the City.

The only surface water connection between Vancouver Lake and the Columbia River, except during floods, is Lake River. The Lake now serves as a receiving body of water for Lake River tidal flow and for surface flow from creeks and nearby land areas. Seasonally, these flows are heavily laden with sediment, septic tank drainage, fertilizers and drainage from cattle yards. Construction and gravel pit operations increase the sediment loads entering the Lake from Burnt Bridge Creek and Salmon Creek (via Lake River by tidal action). The tidal flats at the north end of Vancouver Lake are evidence of this accumulation. Since 1945, the buildup of sediment and nutrients created by man's activities has accelerated the growth of the large water plants and algae which contribute to the degeneration of the Lake. Flooding from the Columbia River, as in 1968, has added to the deposition in Vancouver Lake. The combined effect of these human and natural activities has changed Vancouver Lake into a relatively useless body of shallow water supporting some wildlife, rough fish, and shallow draft boats. It is still pleasant to view from the hills to the east. Because precipitation and streamflow are lowest during the summer and early fall, water quantity and quality conditions are at their worst when the potential of the Lake for water-based recreation is highest.

Increased pollution of the Lake has caused a larger segment of the community to become concerned. Land use and planning studies were undertaken on the Columbia River lowlands, and a wide variety of ideas were proposed for improving the quality of the water-land environment in order to enhance the usefulness of the area. In 1966, the College of Engineering Research Division at Washington State University (WSU) in Pullman, Washington, was contacted by the Port of Vancouver to determine possible alternatives for restoring Vancouver Lake. Various proposals were prepared between 1966 and 1969. During the summer and fall of 1967, a study was made by WSU of the existing water quality in the Lake. In 1969, the current studies were funded to establish a data base for considering a broad range of alternative solutions for improving the quantity and quality of Vancouver Lake. Until these studies were undertaken, practically no data of a continuous nature were available on Vancouver Lake, Lake River, or their tributaries. (Document pdf contains 59 pages)

Relevance:

10.00%

Publisher:

Abstract:

Forty feedlot steers were fed a barley grain-based finishing diet typical for western Canada with two levels of supplementary vitamin E (468 or 1068 IU head⁻¹ d⁻¹), and the effect on the backfat trans-18:1 isomeric profile was determined. Feeding 1068 IU vitamin E reduced the total trans-18:1 content in backfat (P<0.01), as well as the percentage of trans10-18:1 (P<0.001), both of which are associated with an increased risk of cardiovascular disease. On the other hand, trans11-18:1 (vaccenic acid), the precursor of cis9,trans11-18:2 (rumenic acid), both of which have several purported health benefits, increased (P<0.01). Vitamin E could therefore be used to decrease trans-18:1 in beef and improve its isomeric profile.

Relevance:

10.00%

Publisher:

Abstract:

There is strong evidence to suggest that groundwater nitrate concentrations have increased in recent years, and further increases are expected along portions of the central Gulf coast of Florida. Much of the nitrate-enriched groundwater is discharged into surface waters through the numerous freshwater springs that are characteristic of the area, and the potential for eutrophication of their receiving waters is a legitimate concern. To test the potential effects of elevated nutrient concentrations on the periphyton community, an in situ nutrient addition experiment was conducted in the spring-fed Chassahowitzka River, FL, USA, during the summer of 1999. Plastic tubes housing arrays of glass microscope slides were suspended in the stream. Periphyton colonizing the microscope slides was subjected to artificial increases in nitrogen, phosphorus or a combination of both. Slides from each tube were collected at 3- to 4-day intervals and the periphyton communities were measured for chlorophyll concentration. The addition of approximately 10 μg/L of phosphate above ambient concentrations significantly increased the amount of periphyton on artificial substrates relative to controls; the addition of approximately 100 μg/L of nitrate above ambient concentrations did not. The findings from this experiment implicate phosphorus, rather than nitrogen, as the nutrient that potentially limits periphyton growth in this system. (PDF contains 4 pages.)

Relevance:

10.00%

Publisher:

Abstract:

Toxic chemicals can enter the marine environment through numerous routes: stormwater runoff, industrial point source discharges, municipal wastewater discharges, atmospheric deposition, accidental spills, illegal dumping, pesticide applications and agricultural practices. Once they enter a receiving system, toxicants often become bound to suspended particles and increase in density sufficiently to sink to the bottom. Sediments are one of the major repositories of contaminants in aquatic environments. Furthermore, if they become sufficiently contaminated, sediments can act as sources of toxicants to important biota. Sediment quality data are direct indicators of the health of coastal aquatic habitats. Sediment quality investigations conducted by the National Oceanic and Atmospheric Administration (NOAA) and others have indicated that toxic chemicals are found in the sediments and biota of some estuaries in South Carolina and Georgia (NOAA, 1992). This report documents the toxicity of sediments collected within five selected estuaries: Savannah River, Winyah Bay, Charleston Harbor, St. Simons Sound, and Leadenwah Creek (Figure 1). (PDF contains 292 pages)

Relevance:

10.00%

Publisher:

Abstract:

"The Role of Latin in the Early Modern World: Linguistic identity and nationalism 1350-1800". Contributions from the conference held at the Universitat Autònoma de Barcelona, Casa Convalescència, 5-6 May 2010. Edited by Alejandro Coroleu, Carlo Caruso & Andrew Laird

Relevance:

10.00%

Publisher:

Abstract:

When it comes to measuring blade-tip clearance or blade-tip timing in turbines, reflective intensity-modulated optical fiber sensors overcome several traditional limitations of capacitive, inductive or discharging probe sensors. This paper presents the signals and results corresponding to the third stage of a multistage turbine rig, obtained from a transonic wind-tunnel test. The probe is based on a trifurcated bundle of optical fibers that is mounted on the turbine casing. To eliminate the influence of light source intensity variations and blade surface reflectivity, the sensing principle is based on the quotient of the voltages obtained from the two receiving bundle legs. A discrepancy lower than 3% with respect to a commercial sensor was observed in tip clearance measurements. Regarding tip timing measurements, the travelling wave spectrum was obtained, which provides the average vibration amplitude for all blades at a particular nodal diameter. With this approach, both blade-tip timing and tip clearance measurements can be carried out simultaneously. The results obtained on the test turbine rig demonstrate the suitability and reliability of this type of sensor, and suggest the possibility of performing these measurements in real turbines under real working conditions.
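A minimal sketch of the ratiometric principle described above: the quotient of the two receiving-leg voltages is, to first order, independent of source intensity and blade reflectivity, so clearance can be looked up on a pre-measured calibration curve. The calibration values and names below are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical calibration: voltage ratio V2/V1 measured at known clearances (mm).
# In a real probe this curve would come from a bench calibration of the
# trifurcated bundle; the numbers below are for illustration only.
cal_clearance_mm = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
cal_ratio        = np.array([0.35, 0.52, 0.66, 0.77, 0.85, 0.91])

def tip_clearance(v_leg1: float, v_leg2: float) -> float:
    """Estimate tip clearance from the two receiving-leg voltages.

    Dividing the two signals cancels common factors such as source power
    and blade surface reflectivity, leaving a quantity that depends mainly
    on the probe-to-blade distance.
    """
    ratio = v_leg2 / v_leg1
    # Interpolate the (monotonic) calibration curve.
    return float(np.interp(ratio, cal_ratio, cal_clearance_mm))

print(tip_clearance(v_leg1=1.20, v_leg2=0.78))  # estimated clearance in mm
```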

Relevance:

10.00%

Publisher:

Abstract:

Query-by-Example Spoken Term Detection (QbE STD) aims at retrieving data from a speech repository given an acoustic query containing the term of interest as input. It has recently been receiving much interest due to the high volume of information stored in audio or audiovisual format. QbE STD differs from automatic speech recognition (ASR) and from keyword spotting (KWS)/spoken term detection (STD), since ASR is interested in all the terms/words that appear in the speech signal, and KWS/STD relies on a textual transcription of the search term to retrieve the speech data. This paper presents the systems submitted to the ALBAYZIN 2012 QbE STD evaluation, held as part of the ALBAYZIN 2012 evaluation campaign within the context of the IberSPEECH 2012 Conference. The evaluation consists of retrieving the speech files that contain the input queries, indicating their start and end timestamps within the appropriate speech file. Evaluation is conducted on a Spanish spontaneous speech database containing a set of talks from MAVIR workshops, which amount to about 7 h of speech in total. We present the database, the evaluation metric, and the systems submitted, along with all results and some discussion. Four different research groups took part in the evaluation. The results show the difficulty of this task, and the limited performance indicates there is still a lot of room for improvement. The best result is achieved by a dynamic time warping-based search over Gaussian posteriorgrams/posterior phoneme probabilities. The paper also compares the systems, aiming to establish the best-performing techniques for this difficult task and to identify promising directions for this relatively novel task.
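The best-performing approach mentioned above, a dynamic time warping (DTW) search over posteriorgrams, can be illustrated with the following minimal subsequence-DTW sketch. It uses a cosine frame distance and free start/end points on the utterance axis; these choices are illustrative assumptions, not the implementation of any submitted system.

```python
import numpy as np

def frame_cost(q: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Pairwise cosine distance between query frames (Nq x D posteriors)
    and utterance frames (Nx x D posteriors)."""
    qn = q / np.linalg.norm(q, axis=1, keepdims=True)
    xn = x / np.linalg.norm(x, axis=1, keepdims=True)
    return 1.0 - qn @ xn.T                 # shape (Nq, Nx)

def subsequence_dtw_score(query: np.ndarray, utterance: np.ndarray) -> float:
    """Lower scores indicate a better match of the query somewhere inside
    the utterance (free start and end points on the utterance axis)."""
    C = frame_cost(query, utterance)
    Nq, Nx = C.shape
    D = np.full((Nq, Nx), np.inf)
    D[0, :] = C[0, :]                      # the match may start at any frame
    for i in range(1, Nq):
        for j in range(Nx):
            best_prev = D[i - 1, j]
            if j > 0:
                best_prev = min(best_prev, D[i, j - 1], D[i - 1, j - 1])
            D[i, j] = C[i, j] + best_prev
    return float(D[-1, :].min() / Nq)      # length-normalised detection score

# Example with random stand-in "posteriorgrams".
rng = np.random.default_rng(0)
q, u = rng.random((20, 32)), rng.random((300, 32))
print(subsequence_dtw_score(q, u))
```

A detection system would slide this scoring over every utterance in the repository, rank the resulting scores, and report the time span of the best warping path as the hypothesised occurrence.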

Relevance:

10.00%

Publisher:

Abstract:

The Alliance for Coastal Technologies (ACT) convened a workshop on Evaluating Approaches and Technologies for Monitoring Organic Contaminants in the Aquatic Environment in Ann Arbor, MI on July 21-23, 2006. The primary objectives of this workshop were to: 1) identify the priority management information needs relative to organic contaminant loading; 2) explore the most appropriate approaches to estimating mass loading; and 3) evaluate the current status of the sensor technology. To meet these objectives, leading research scientists, resource managers, and industry representatives were brought together for a focused two-day workshop. The workshop featured four plenary talks followed by breakout sessions in which groups of participants were charged with responding to a series of focused discussion questions.

At present, there are major concerns about the inadequacies of approaches and technologies for quantifying mass emissions and detecting organic contaminants in order to protect municipal water supplies and receiving waters. Managers use estimates of land-based contaminant loadings to rivers, lakes, and oceans to assess relative risk among various contaminant sources, determine compliance with regulatory standards, and define progress in source reduction. However, accurately quantifying contaminant loading remains a major challenge (a minimal loading calculation is sketched after this summary). Loading occurs over a range of hydrologic conditions, requiring measurement technologies that can accommodate a broad range of ambient conditions. In addition, in situ chemical sensors that provide a means for acquiring continuous concentration measurements are still under development, particularly for organic contaminants that typically occur at low concentrations. Better approaches and strategies for estimating contaminant loading, including evaluations of both sampling design and sensor technologies, need to be identified.

The following general recommendations were made in an effort to advance future organic contaminant monitoring:
1. Improve the understanding of material balance in aquatic systems and the relationship between potential surrogate measures (e.g., DOC, chlorophyll, particle size distribution) and target constituents.
2. Develop continuous real-time sensors to be used by managers as screening measures and triggers for more intensive monitoring.
3. Pursue surrogate measures and indicators of organic pollutant contamination, such as CDOM, turbidity, or non-equilibrium partitioning.
4. Develop continuous field-deployable sensors for PCBs, PAHs, pyrethroids, and emerging contaminants of concern, and develop strategies that couple sampling approaches with tools that incorporate sensor synergy (i.e., measure appropriate surrogates along with the dissolved organics to allow full mass emission estimation).

[PDF contains 20 pages]
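As a back-of-the-envelope illustration of the mass loading estimates discussed above, the sketch below simply integrates paired concentration and flow measurements over time. The sampling interval, unit conventions and example numbers are assumptions for illustration only; they are not taken from the workshop report.

```python
import numpy as np

def mass_load_kg(concentration_ug_per_L: np.ndarray,
                 flow_m3_per_s: np.ndarray,
                 dt_s: float) -> float:
    """Estimate a contaminant load by integrating concentration x flow over time.

    concentration_ug_per_L : paired concentration samples (ug/L)
    flow_m3_per_s          : paired discharge measurements (m^3/s)
    dt_s                   : sampling interval (s)

    1 ug/L = 1e-6 kg/m^3, so the product C * Q is the instantaneous load in
    units of 1e-6 kg/s; summing over the sampling interval gives kilograms.
    """
    kg_per_s = concentration_ug_per_L * 1e-6 * flow_m3_per_s
    return float(np.sum(kg_per_s) * dt_s)

# Hypothetical hourly samples over one storm event.
c = np.array([0.8, 2.5, 6.0, 4.1, 1.9, 0.9])    # ug/L
q = np.array([3.0, 7.5, 12.0, 9.0, 5.0, 3.2])   # m^3/s
print(mass_load_kg(c, q, dt_s=3600.0), "kg")
```

In practice, the workshop's concern is precisely that concentration is rarely measured continuously, so surrogate signals (e.g., turbidity or CDOM) are often substituted for `c` and converted through a site-specific regression.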

Relevance:

10.00%

Publisher:

Abstract:

Development pressure throughout the coastal areas of the United States continues to build, particularly in the southeast (Allen and Lu 2003, Crossett et al. 2004). It is well known that development alters watershed hydrology: as land becomes covered with surfaces impervious to rain, water is redirected from groundwater recharge and evapotranspiration to stormwater runoff, and as the area of impervious cover increases, so does the volume and rate of runoff (Schueler 1994, Corbett et al. 1997). Pollutants accumulate on impervious surfaces, and the increased runoff with urbanization is a leading cause of nonpoint source pollution (USEPA 2002). Sediment, chemicals, bacteria, viruses, and other pollutants are carried into receiving water bodies, resulting in degraded water quality (Holland et al. 2004, Sanger et al. 2008). (PDF contains 5 pages)

Relevance:

10.00%

Publisher:

Abstract:

Gold Coast Water is responsible for the management of the water and wastewater assets of the City of the Gold Coast on Australia’s east coast. Treated wastewater is released at the Gold Coast Seaway on an outgoing tide so that the plume is dispersed before the tide changes and re-enters the Broadwater estuary. Rapid population growth over the past decade has placed increasing demands on the receiving waters for the release of the City’s effluent. The Seaway SmartRelease Project is designed to optimise the release of effluent from the City’s main wastewater treatment plant in order to minimise the impact on estuarine water quality and maximise the cost efficiency of pumping. To this end, an optimisation study involving water quality monitoring, numerical modelling and a web-based decision support system was conducted. An intensive monitoring campaign provided information on water levels, currents, winds, waves, nutrients and bacterial levels within the Broadwater. These data were then used to calibrate and verify numerical models using the MIKE by DHI suite of software. The decision support system collects continually measured data such as water levels, interacts with the WWTP SCADA system, runs the models in forecast mode and provides the optimal time window in which to release the required amount of effluent from the WWTP. The City’s increasing population means that the length of time available for releasing the water with minimal impact may be exceeded within 5 years. Optimising the release of the treated water through monitoring, modelling and a decision support system has been an effective way of demonstrating the limited environmental impact of the expected short-term increase in effluent disposal procedures. (PDF contains 5 pages)
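A highly simplified sketch of the kind of release-window logic described above: given a forecast water-level series, find a falling-tide window long enough to discharge the required volume at a given pumping rate. The tide model, thresholds and parameter names are illustrative assumptions; the actual SmartRelease system relies on calibrated hydrodynamic and water quality models run in forecast mode.

```python
import numpy as np

def best_release_window(times_h: np.ndarray,
                        water_level_m: np.ndarray,
                        required_volume_m3: float,
                        pump_rate_m3_per_h: float):
    """Return (start_h, end_h) of a falling-tide window long enough to
    discharge the required volume, or None if no window is long enough.

    'Outgoing tide' is taken here simply as a falling water level; a real
    decision support system would use forecast hydrodynamics instead.
    """
    needed_h = required_volume_m3 / pump_rate_m3_per_h
    falling = np.diff(water_level_m) < 0           # True where the level drops
    start = None
    for i, f in enumerate(falling):
        if f and start is None:
            start = i                              # window opens
        if (not f or i == len(falling) - 1) and start is not None:
            end = i if not f else i + 1            # window closes
            if times_h[end] - times_h[start] >= needed_h:
                return float(times_h[start]), float(times_h[end])
            start = None
    return None

# Hypothetical 12-minute forecast samples over one day (idealised semi-diurnal tide).
t = np.arange(0, 24, 0.2)
level = 0.8 * np.sin(2 * np.pi * t / 12.4)
print(best_release_window(t, level, required_volume_m3=30000, pump_rate_m3_per_h=6000))
```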

Relevance:

10.00%

Publisher:

Abstract:

Signal processing techniques play important roles in the design of digital communication systems. These include information manipulation, transmitter signal processing, channel estimation, channel equalization and receiver signal processing. By interacting with communication theory and with system implementation technologies, signal processing specialists develop efficient schemes for various communication problems by wisely exploiting mathematical tools such as analysis, probability theory, matrix theory, optimization theory, and many others. In recent years, researchers realized that multiple-input multiple-output (MIMO) channel models are applicable to a wide range of different physical communication channels. Using elegant matrix-vector notation, many MIMO transceiver (precoder and equalizer) design problems can be solved with matrix and optimization theory. Furthermore, researchers showed that majorization theory and matrix decompositions, such as the singular value decomposition (SVD), the geometric mean decomposition (GMD) and the generalized triangular decomposition (GTD), provide unified frameworks for solving many point-to-point MIMO transceiver design problems.
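To show how these decomposition-based frameworks work in their simplest form, the sketch below uses the SVD to turn a flat MIMO channel with full channel state information into independent scalar subchannels; the GMD/GTD-based designs developed in the thesis follow the same pattern but use triangular factors together with decision feedback. This is a generic textbook illustration, not code from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
Nt, Nr, Ns = 4, 4, 4                      # transmit antennas, receive antennas, streams

# Random flat MIMO channel, assumed known at both ends (CSIT and CSIR).
H = (rng.standard_normal((Nr, Nt)) + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)

U, s, Vh = np.linalg.svd(H)               # H = U @ diag(s) @ Vh

x = (rng.integers(0, 2, (Ns, 100)) * 2 - 1).astype(complex)   # BPSK symbol streams
tx = Vh.conj().T[:, :Ns] @ x              # precoder: right singular vectors
rx = H @ tx                               # noiseless channel, for clarity
y = U.conj().T[:Ns, :] @ rx               # equalizer: left singular vectors

# Each stream now sees an independent scalar gain s[k] with no inter-stream interference.
print(np.allclose(y, np.diag(s[:Ns]) @ x))   # expected: True
```

Because the SVD subchannel gains differ, an SVD transceiver normally needs bit/power allocation across subchannels; the GMD-style designs discussed below avoid this by equalizing the subchannel quality instead.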

In this thesis, we consider transceiver design problems for linear time-invariant (LTI) flat MIMO channels, linear time-varying narrowband MIMO channels, flat MIMO broadcast channels, and doubly selective scalar channels. The channel estimation problem is also considered. The main contributions of this dissertation are the development of new matrix decompositions and the use of these decompositions, together with majorization theory, in practical transmit-receive scheme designs for transceiver optimization problems. Elegant solutions are obtained, novel transceiver structures are developed, ingenious algorithms are proposed, and performance analyses are derived.

The first part of the thesis focuses on transceiver design for LTI flat MIMO channels. We propose a novel matrix decomposition which decomposes a complex matrix, in an iterative manner, into a product of sets of semi-unitary matrices and upper triangular matrices. The complexity of the new decomposition, the generalized geometric mean decomposition (GGMD), is always less than or equal to that of the geometric mean decomposition (GMD). The optimal GGMD parameters which yield the minimal complexity are derived. Based on the channel state information (CSI) at both the transmitter (CSIT) and the receiver (CSIR), the GGMD is used to design a butterfly-structured decision feedback equalizer (DFE) MIMO transceiver which achieves the minimum average mean square error (MSE) under a total transmit power constraint. A novel iterative detection algorithm for this receiver is also proposed. For application to cyclic prefix (CP) systems, in which the SVD of the equivalent channel matrix can be computed easily, the proposed GGMD transceiver has a K/log_2(K)-fold complexity advantage over the GMD transceiver, where K is the number of data symbols per data block and is a power of 2. The performance analysis shows that the GGMD DFE transceiver can convert a MIMO channel into a set of parallel subchannels with the same bias and the same signal-to-interference-plus-noise ratio (SINR). Hence, the average bit error rate (BER) is automatically minimized without the need for bit allocation. Moreover, the proposed transceiver can achieve the channel capacity simply by applying independent scalar Gaussian codes of the same rate on the subchannels.
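For readers unfamiliar with triangular-decomposition transceivers, the sketch below shows the generic structure they build on: a QR factorization of the channel, a unitary feedforward filter, and symbol-by-symbol decision feedback under a zero-forcing criterion. This is a plain QR-based DFE for illustration only; it is not the GGMD butterfly design, which additionally equalizes the diagonal of the triangular factor so that every subchannel sees the same SINR.

```python
import numpy as np

def qr_dfe_detect(H: np.ndarray, y: np.ndarray, constellation: np.ndarray) -> np.ndarray:
    """Zero-forcing DFE via QR: apply the feedforward filter Q^H, then detect
    streams from last to first, feeding decisions back through the rows of R."""
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y                     # feedforward stage: z = R x + filtered noise
    n = H.shape[1]
    x_hat = np.zeros(n, dtype=complex)
    for k in range(n - 1, -1, -1):         # detect the last stream first
        residual = z[k] - R[k, k + 1:] @ x_hat[k + 1:]   # cancel already-detected streams
        x_hat[k] = constellation[np.argmin(np.abs(constellation - residual / R[k, k]))]
    return x_hat

rng = np.random.default_rng(2)
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
x = qpsk[rng.integers(0, 4, 4)]
noise = 0.01 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
print(np.allclose(qr_dfe_detect(H, H @ x + noise, qpsk), x))   # should recover x at low noise
```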

In the second part of the thesis, we focus on MIMO transceiver design for slowly time-varying MIMO channels under a zero-forcing or MMSE criterion. Even though the GGMD/GMD DFE transceivers work for slowly time-varying MIMO channels by exploiting the instantaneous CSI at both ends, their performance is by no means optimal, since the temporal diversity of the time-varying channels is not exploited. Based on the GTD, we develop the space-time GTD (ST-GTD) for the decomposition of linear time-varying flat MIMO channels. Under the assumption that CSIT, CSIR and channel prediction are available, we use the proposed ST-GTD to develop space-time geometric mean decomposition (ST-GMD) DFE transceivers under the zero-forcing or MMSE criterion. Under perfect channel prediction, the new system minimizes both the average MSE at the detector in each space-time (ST) block (which consists of several coherence blocks) and the average per-ST-block BER in the moderately high SNR region. Moreover, the ST-GMD DFE transceiver designed under the MMSE criterion maximizes the Gaussian mutual information over the equivalent channel seen by each ST block. In general, the newly proposed transceivers perform better than the GGMD-based systems, since the superimposed temporal precoder is able to exploit the temporal diversity of time-varying channels. For practical applications, a novel ST-GTD based system which does not require channel prediction, but shares the same asymptotic BER performance with the ST-GMD DFE transceiver, is also proposed.

The third part of the thesis considers two quality of service (QoS) transceiver design problems for flat MIMO broadcast channels. The first is the power minimization problem (min-power) with a total bitrate constraint and per-stream BER constraints. The second is the rate maximization problem (max-rate) with a total transmit power constraint and per-stream BER constraints. Exploiting a particular class of joint triangularization (JT), we are able to jointly optimize the bit allocation and the broadcast DFE transceiver for the min-power and max-rate problems. The resulting optimal designs are called the minimum power JT broadcast DFE transceiver (MPJT) and the maximum rate JT broadcast DFE transceiver (MRJT), respectively. In addition to the optimal designs, two suboptimal designs based on the QR decomposition are proposed; they are realizable for an arbitrary number of users.

Finally, we investigate the design of a discrete Fourier transform (DFT) modulated filterbank transceiver (DFT-FBT) for linear time-varying (LTV) scalar channels. For both the case of known LTV channels and that of channels known only through their wide-sense stationary uncorrelated scattering (WSSUS) statistics, we show how to optimize the transmitting and receiving prototypes of a DFT-FBT so that the SINR at the receiver is maximized. In addition, a novel pilot-aided subspace channel estimation algorithm is proposed for orthogonal frequency division multiplexing (OFDM) systems with quasi-stationary multipath Rayleigh fading channels. Using the concept of a difference co-array, the new technique can construct M^2 co-pilots from M physical pilot tones with alternating pilot placement. Subspace methods such as MUSIC and ESPRIT can then be used to estimate the multipath delays, and the number of identifiable paths is, theoretically, up to O(M^2). With the delay information, an MMSE estimator of the frequency response is derived. Simulations show that the proposed method outperforms the conventional subspace channel estimator when the number of multipath components is greater than or equal to the number of physical pilots minus one.
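A small sketch of the difference co-array idea referred to above: from M physical pilot tone indices one forms the set of pairwise index differences, which can contain on the order of M^2 distinct lags, and it is on this larger virtual grid that the subspace delay estimator operates. The pilot placement below is an arbitrary illustrative choice, not the alternating placement proposed in the thesis.

```python
import numpy as np

def difference_coarray(pilot_indices: np.ndarray) -> np.ndarray:
    """Return the sorted distinct pairwise differences of the pilot tone indices.

    With M physical pilots the co-array can contain up to M*(M-1)+1 distinct
    lags, which is why subspace methods operating on it can resolve on the
    order of M^2 multipath delays.
    """
    diffs = pilot_indices[:, None] - pilot_indices[None, :]
    return np.unique(diffs)

# Illustrative sparse pilot placement on a 64-subcarrier OFDM symbol.
pilots = np.array([0, 1, 4, 9, 15, 22, 32, 45])
lags = difference_coarray(pilots)
print(len(pilots), "pilots ->", len(lags), "distinct co-array lags")
```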

Relevance:

10.00%

Publisher:

Abstract:

Some of the most exciting developments in the field of nucleic acid engineering include the utilization of synthetic nucleic acid molecular devices as gene regulators, as disease marker detectors, and most recently, as therapeutic agents. The common thread between these technologies is their reliance on the detection of specific nucleic acid input markers to generate some desirable output, such as a change in the copy number of an mRNA (for gene regulation), a change in the emitted light intensity (for some diagnostics), and a change in cell state within an organism (for therapeutics). The research presented in this thesis likewise focuses on engineering molecular tools that detect specific nucleic acid inputs, and respond with useful outputs.

Four contributions to the field of nucleic acid engineering are presented: (1) the construction of a single nucleotide polymorphism (SNP) detector based on the mechanism of hybridization chain reaction (HCR); (2) the utilization of a single-stranded oligonucleotide molecular Scavenger as a means of enhancing HCR selectivity; (3) the implementation of Quenched HCR, a technique that facilitates transduction of a nucleic acid chemical input into an optical (light) output, and (4) the engineering of conditional probes that function as sequence transducers, receiving target signal as input and providing a sequence of choice as output. These programmable molecular systems are conceptually well-suited for performing wash-free, highly selective rapid genotyping and expression profiling in vitro, in situ, and potentially in living cells.

Relevance:

10.00%

Publisher:

Abstract:

This study examines the mechanisms through which a teacher, by means of interaction, gets students to build knowledge, investigating how the teacher adjusts the support provided and how that support is withdrawn. The aim was to identify, describe and understand the mechanisms that shape the interaction. To this end, participant observation was carried out in a setting that encourages student initiative: classroom reality was observed systematically, and the events that occurred were recorded, described, analysed and interpreted. Once the transcribed sessions had been analysed, interaction segments were extracted, capturing the most frequently recurring behaviour patterns within them. It was concluded that, both when adjusting support and when withdrawing it, the teacher starts from the students' own initiatives.

Relevance:

10.00%

Publisher:

Abstract:

This study falls within the scope of research on documents that systematize the teacher's work, among them the teacher's manual that organizes teaching activity around the textbook. The dissertation analyses the teacher's manuals of the Spanish-language textbooks selected by MEC for distribution to teachers in 2005, in connection with Law 11161, which made the teaching of Spanish mandatory in secondary education throughout the country. The objective was to identify the discursive images of the teacher, and of the teaching of Spanish as a foreign language, constructed in these manuals. The theoretical framework comes from enunciation-based Discourse Analysis, together with the concepts of dialogism (BAKHTIN, 1979) and polyphony (BAKHTIN, 1979; DUCROT, 1987). The results point to the construction of images of the teacher as: someone who needs to be guided in their task, incapable of making their own choices in the classroom, a receiver of orders; someone out of date with current teaching methodologies and therefore in need of professional updating; and a teacher who looks for instructions that make their work easier. As for the view of language, we find one manual that emphasizes work with reading, oriented toward a conception that values discursive aspects, and others that claim to follow the communicative approach, with an eye on language in use, yet adopt procedures grounded in a conception of language as structure and/or mix both perspectives.