996 results for Monetary Award Program (Ill.)


Relevance: 20.00%

Publisher:

Abstract:

Common carp is one of the most important cultured freshwater fish species in the world. Its production in freshwater areas is the second largest in Europe after rainbow trout. Common carp production in Europe was 146,845 t in 2004 (FAO Fishstat Plus 2006) and is concentrated mainly in Central and Eastern Europe. In Hungary, common carp has traditionally been cultured in earthen ponds since the late 19th century, following the sharp drop in catches from natural waters due to the regulation of the main river systems. Different production technologies and unintentional selection methods resulted in a wide variety of forms of this species. Just before the intensification of rearing technology and the exchange of stocking materials among fish farms (the early sixties), “landraces” of carp were collected from practically all Hungarian fish farms into a live gene bank at the Research Institute for Fisheries, Aquaculture and Irrigation (HAKI) at Szarvas (Bakos and Gorda 1995; Bakos and Gorda 2001). Starting in 1964, different strains of, and crosses between, Hungarian landraces were created and tested in order to provide highly productive hybrids for production purposes. During the last 40 years, approximately 150 two-, three-, and four-line hybrids were produced. While developing parental lines, methods of individual selection, inbreeding, backcrossing of lines, gynogenesis and sex reversal were used. This breeding program resulted in three outstanding hybrids: “Szarvas 215 mirror” and “Szarvas P31 scaly” for pond production, and “Szarvas P34 scaly” for angling waters. Besides satisfying the needs of industry, the live gene bank helped to conserve the biological diversity of Hungarian carp landraces. Fifteen Hungarian carp landraces are still maintained today in the gene bank. Through exchange programs, fifteen foreign carp strains were added to the collection from Central and Eastern Europe, as well as Southeast Asia (Bakos and Gorda 2001).
Besides developing the methodology to maintain live specimens in the gene bank, a National Carp Breeding Program was initiated in cooperation with all the key stakeholders in Hungary, namely the National Association of Fish Producers (HOSZ), the National Institute for Agricultural Quality Control (OMMI), and the Research Institute for Fisheries, Aquaculture and Irrigation (HAKI). In addition, methodologies and technologies for broodstock management and carp performance testing have been developed. The National Carp Breeding Program has been implemented successfully since the mid-1990s.

Abstract:

For the first time in India, selective breeding work has been initiated at the Central Institute of Freshwater Aquaculture, Bhubaneswar, India, in collaboration with the Institute of Aquaculture Research (AKVAFORSK), Norway. Rohu was chosen as the model species because it enjoys the highest consumer preference among the Indian major carps (IMC), although its growth was observed to be slower than that of the other IMC. As this was the first selection work on any Indian major carp, many procedures and techniques for successful implementation of the program were standardized (e.g. production of full-sib groups, establishment of a model hatchery for selective breeding of carps, rearing of full-sib groups in partitioned nursery ponds, individual tagging with Passive Integrated Transponder (PIT) tags, communal rearing, sampling, data analysis, field testing and dissemination of improved rohu). After four generations of selection, an average growth gain of 17 per cent per generation was observed in the improved rohu.
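The reported per-generation gain compounds over the four generations; a minimal sketch (assuming, hypothetically, that the 17 per cent figure applies multiplicatively to growth) illustrates the cumulative effect:

```python
# Hypothetical illustration: cumulative effect of a 17% per-generation
# growth gain, assuming the gains compound multiplicatively.
gain_per_generation = 0.17
generations = 4

# Relative growth of the improved line versus the unimproved base (= 1.0)
relative_growth = (1 + gain_per_generation) ** generations
print(f"{relative_growth:.2f}x")  # about 1.87x, i.e. ~87% above the base line
```

Under this assumption the improved rohu would reach roughly 1.87 times the base population's growth, though whether the gains compound in this way is not stated in the abstract.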

Abstract:

EXECUTIVE SUMMARY: The Coastal Change Analysis Program (C-CAP) is developing a nationally standardized database on land-cover and habitat change in the coastal regions of the United States. C-CAP is part of the Estuarine Habitat Program (EHP) of NOAA's Coastal Ocean Program (COP). C-CAP inventories coastal submersed habitats, wetland habitats, and adjacent uplands and monitors changes in these habitats on a one- to five-year cycle. This type of information and frequency of detection are required to improve scientific understanding of the linkages of coastal and submersed wetland habitats with adjacent uplands and with the distribution, abundance, and health of living marine resources. The monitoring cycle will vary according to the rate and magnitude of change in each geographic region. Satellite imagery (primarily Landsat Thematic Mapper), aerial photography, and field data are interpreted, classified, analyzed, and integrated with other digital data in a geographic information system (GIS). The resulting land-cover change databases are disseminated in digital form for use by anyone wishing to conduct geographic analysis in the completed regions. C-CAP spatial information on coastal change will be input to EHP conceptual and predictive models to support coastal resource policy planning and analysis. C-CAP products will include 1) spatially registered digital databases and images, 2) tabular summaries by state, county, and hydrologic unit, and 3) documentation. Aggregations to larger areas (representing habitats, wildlife refuges, or management districts) will be provided on a case-by-case basis. Ongoing C-CAP research will continue to explore techniques for remote determination of biomass, productivity, and functional status of wetlands and will evaluate new technologies (e.g. remote sensor systems, global positioning systems, image processing algorithms) as they become available.
Selected hardcopy land-cover change maps will be produced at local (1:24,000) to regional scales (1:500,000) for distribution. Digital land-cover change data will be provided to users for the cost of reproduction. Much of the guidance contained in this document was developed through a series of professional workshops and interagency meetings that focused on a) coastal wetlands and uplands; b) coastal submersed habitat including aquatic beds; c) user needs; d) regional issues; e) classification schemes; f) change detection techniques; and g) data quality. Invited participants included technical and regional experts and representatives of key State and Federal organizations. Coastal habitat managers and researchers were given an opportunity for review and comment. This document summarizes C-CAP protocols and procedures that are to be used by scientists throughout the United States to develop consistent and reliable coastal change information for input to the C-CAP nationwide database. It also provides useful guidelines for contributors working on related projects. It is considered a working document subject to periodic review and revision. (PDF file contains 104 pages.)
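The change-monitoring step described above amounts to cross-tabulating classified land cover from two dates. A minimal sketch of that idea follows; the class codes and grids are invented for illustration and are not C-CAP's actual classification scheme:

```python
from collections import Counter

# Hypothetical sketch of post-classification change detection:
# cross-tabulate two classified land-cover grids from different dates.
# Invented class codes: 0 = water, 1 = wetland, 2 = upland.
date1 = [0, 1, 1,
         2, 2, 1,
         0, 1, 2]
date2 = [0, 1, 2,
         2, 2, 2,
         0, 0, 2]

# change[(i, j)] counts pixels that moved from class i (date 1) to class j (date 2)
change = Counter(zip(date1, date2))
print(change[(1, 2)])  # prints 2: two pixels changed wetland -> upland
```

The off-diagonal entries of such a change tabulation (e.g. wetland pixels that became upland) are the habitat changes that summaries by state, county, and hydrologic unit would report.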

Abstract:

In this essay, three lines of evidence are developed indicating that sturgeons in the Chesapeake Bay and elsewhere are unusually sensitive to hypoxic conditions: 1. In comparison to other fishes, sturgeons have a limited behavioral and physiological capacity to respond to hypoxia. Basal metabolism, growth, and consumption are quite sensitive to changes in oxygen level, which may indicate a relatively poor ability by sturgeons to oxyregulate. 2. During summertime, temperatures >20 °C amplify the effect of hypoxia on sturgeons and other fishes due to a temperature × oxygen "squeeze" (Coutant 1987). In bottom waters, this interaction results in a substantial reduction of habitat; in dry years, nursery habitats in the Chesapeake Bay may be particularly reduced or even eliminated. 3. While evidence for population-level effects of hypoxia is circumstantial, the absence of Atlantic sturgeon reproduction corresponds with estuaries, such as the Chesapeake Bay, where summertime hypoxia predominates on a system-wide scale. Also, the recent and dramatic recovery of shortnose sturgeon in the Hudson River (a 4-fold increase in abundance from 1980 to 1995) may have been stimulated by the improvement of a large portion of the nursery habitat, which was restored from hypoxia to normoxia during the period 1973-1978. (PDF contains 26 pages)

Abstract:

Background: A new intervention for managing patients with medically unexplained symptoms (MUS), based on a specific set of communication techniques, was developed and tested in a cluster randomised clinical trial. Given the modest results obtained, and in order to improve the intervention, we needed to know GPs' attitudes towards patients with MUS, their experience and expectations, the utility of the communication techniques we proposed, and the feasibility of implementing them. Physicians who took part in two different training programs and in a randomised controlled trial (RCT) for patients with MUS were questioned to ascertain the reasons for their participation in the trial and their attitudes, experiences and expectations regarding the intervention. Methods: A qualitative study based on four focus groups with GPs who took part in the RCT. A content analysis was carried out. Results: Following the RCT, patients are perceived as truly suffering persons, and the relationship with them has improved for GPs in both groups. GPs mostly valued the fact that the intervention is highly structured, that it made a more comfortable relationship possible, and that it could be applied to a broad spectrum of patients with psychosocial problems. Nevertheless, all participants considered that change in patients is necessary; GPs in the intervention group remarked that this is extremely difficult to achieve. Conclusion: GPs positively evaluate communication techniques and interventions that help in understanding patient suffering, and express the enormous difficulties in bringing about change in patients. These findings provide information on the direction in which efforts to improve the intervention should be directed.

Abstract:

PURPOSE: The main goals of the present study were: 1) to review recommendations on how to increase lean body mass; 2) to analyse whether, following current scientifically based recommendations, visible changes could be shown in a participant (body composition, strength and blood analyses). METHODS: One male athlete completed a 12-week resistance training program while following a diet protocol. Several tests were performed: 6RM strength, blood analyses, skinfold measurements, body perimeters and a bioimpedance test. Body composition measurements were taken three times during the program (before, T1; after 6 weeks of the intervention period, T2; and at the end of the program, T3). Strength tests and blood analyses were performed twice (before and after the program). RESULTS: Strength increased in general; blood analyses showed that creatine kinase increased by 104% and the triglyceride level decreased by 22.5%; in the impedance test, body mass (1.6%), lean body mass (3.5%) and body mass index (1.7%) increased, whereas fat mass decreased (15.5%); relaxed and contracted biceps perimeters also increased. CONCLUSION: A muscle hypertrophy training program combined with an appropriate diet over 12 weeks leads to interesting adaptations: increases in body weight, lean body mass, biceps perimeters, strength and creatine kinase levels, and a decrease in fat mass.

Abstract:

A means of assessing the effectiveness of methods used in the numerical solution of various linear ill-posed problems is outlined. Two methods, Tikhonov's method of regularization and the quasireversibility method of Lattès and Lions, are appraised from this point of view.

In the former method, Tikhonov provides a useful means for incorporating a constraint into numerical algorithms. The analysis suggests that the approach can be generalized to embody constraints other than those employed by Tikhonov. This is effected and the general "T-method" is the result.

A T-method is used on an extended version of the backwards heat equation with spatially variable coefficients. Numerical computations based upon it are performed.

The statistical method developed by Franklin is shown to have an interpretation as a T-method. This interpretation, although somewhat loose, does explain some empirical convergence properties which are difficult to pin down via a purely statistical argument.
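As a concrete illustration of the regularization being appraised, here is a minimal sketch of Tikhonov's idea on a nearly singular 2×2 system; the matrix and damping value are illustrative, not taken from the thesis:

```python
# A minimal sketch of Tikhonov regularization on a nearly singular system:
# instead of solving A x = b directly, minimize ||A x - b||^2 + lam * ||x||^2,
# whose normal equations are (A^T A + lam * I) x = A^T b.

def tikhonov_2x2(A, b, lam):
    # Form A^T A + lam * I and A^T b for a 2x2 system
    ata = [[sum(A[k][i] * A[k][j] for k in range(2)) + (lam if i == j else 0.0)
            for j in range(2)] for i in range(2)]
    atb = [sum(A[k][i] * b[k] for k in range(2)) for i in range(2)]
    # Solve the regularized 2x2 system by Cramer's rule
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x0 = (atb[0] * ata[1][1] - ata[0][1] * atb[1]) / det
    x1 = (ata[0][0] * atb[1] - atb[0] * ata[1][0]) / det
    return [x0, x1]

A = [[1.0, 1.0], [1.0, 1.0001]]   # nearly rank-deficient
b = [2.0, 2.0001]
print(tikhonov_2x2(A, b, 1e-6))   # regularized solution stays near [1, 1]
```

Larger values of lam stabilize the solution further at the cost of bias toward zero; the choice of this penalty term plays the role of the constraint that the T-method framework generalizes.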

Abstract:

In response to a growing body of research on projected climate change impacts to Washington State's coastal areas, the Washington State Department of Natural Resources' (DNR) Aquatic Resources Program (the Program) initiated a climate change preparedness effort in 2009 via the development of a Climate Change Adaptation Strategy (the Strategy). The Strategy answers the question "What are the next steps that the Program can take to begin preparing for and adapting to climate change impacts in Washington's coastal areas?" by considering how projected climate change impacts may affect: (1) Washington's state-owned aquatic lands, (2) the Program's management activities, and (3) DNR's statutorily established guidelines for managing Washington's state-owned aquatic lands for the benefit of the public. The Program manages Washington's state-owned aquatic lands according to the guidelines set forth in Revised Code of Washington 79-105-030, which stipulates that DNR must manage state-owned aquatic lands in a manner that provides a balance of the following public benefits: (1) encouraging direct public uses and access; (2) fostering water-dependent uses; (3) ensuring environmental protection; (4) utilizing renewable resources (RCW 79-105-030). The law also stipulates that generating revenue in a manner consistent with these four benefits is itself a public benefit (RCW 79-105-030). Many of the next steps identified in the Strategy build on recommendations provided by earlier climate change preparation and adaptation efforts in Washington State, most notably those provided by the Preparation and Adaptation Working Group, which was convened by Washington State Executive Order 70-02 in 2007, and those made in the Washington Climate Change Impacts Assessment (Climate Impacts Group, 2009). (PDF contains 4 pages)

Abstract:

The foundation of the argument of Habermas, a leading critical theorist, lies in the unequal distribution of wealth across society. He states that in an advanced capitalist society, the possibility of a crisis has shifted from the economic and political spheres to the legitimation system. Legitimation crises increase the more government intervenes in the economy (market), together with the "simultaneous political enfranchisement of almost the entire adult population" (Holub, 1991, p. 88). The reason for this increase is that policymakers in advanced capitalist democracies are caught between conflicting imperatives: they are expected to serve the interests of their nation as a whole, but they must prop up an economic system that benefits the wealthy at the expense of most workers and the environment. Habermas argues that the driving force in history is an expectation, built into the nature of language, that norms, laws, and institutions will serve the interests of the entire population and not just those of a special group. In his view, policymakers in capitalist societies must fend off this expectation by simultaneously correcting some of the inequities of the market, denying that they have control over people's economic circumstances, and defending the market as an equitable allocator of income (deHaven-Smith, 1988, p. 14). Critical theory suggests that this contradiction will be reflected in Everglades policy by communicative narratives that suppress and conceal tensions between environmental and economic priorities. Habermas's Legitimation Crisis states that political actors use various symbols, ideologies, narratives, and language to engage the public and avoid a legitimation crisis. These influences not only manipulate the general population into desiring what has been manufactured for them, but also leave them feeling unfulfilled and alienated.
Also known as false reconciliation, the public's view of society as rational and "conducive to human freedom and happiness" is altered to become deeply irrational and an obstacle to the desired freedom and happiness (Finlayson, 2005, p. 5). These obstacles and irrationalities give rise to potential crises in the society. Government's increasing involvement in the Everglades under advanced capitalism leads to Habermas's four crises: economic/environmental, rationality, legitimation, and motivation. These crises occur simultaneously, work in conjunction with each other, and arise when a principle of organization is challenged by increased production needs (deHaven-Smith, 1988). Habermas states that governments use narratives in an attempt to rationalize, legitimize, obscure, and conceal their actions under advanced capitalism. Although many narratives have been told throughout the history of the Everglades (for example, that the Everglades was a wilderness valued as a wasteland in its natural state), the most recent narrative, “Everglades Restoration”, is the focus of this paper. (PDF contains 4 pages)

Abstract:

[EN] The objective of this study was to determine whether a short training program, using real foods, would decrease participants' portion-size estimation errors. Ninety student volunteers (20.18±0.44 y old) from the University of the Basque Country (Spain) were trained in observational techniques and tested in food-weight estimation during and after a 3-hour training period. The program included 57 commonly consumed foods representing a variety of forms (125 different shapes). Estimates of food weight were compared with actual weights. The effectiveness of training was determined by examining the change in absolute percentage error across all observers and all foods over time. Data were analyzed using SPSS v. 13.0. Portion-size errors decreased after training for most of the foods. Additionally, the accuracy of the estimates varied clearly by food group and form. Amorphous foods were estimated least accurately both before and after training. Our findings suggest that future dietitians can be trained to estimate quantities by direct observation across a wide range of foods. However, this training may have been too brief for participants to fully assimilate it.
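The error measure used here, absolute percentage error averaged over foods, can be sketched as follows; the foods and weights are invented for illustration:

```python
# Hypothetical sketch of the error measure described: absolute percentage
# error of each weight estimate, averaged over the foods estimated.

def mean_abs_pct_error(estimates, actual_weights):
    errors = [abs(est - act) / act * 100.0
              for est, act in zip(estimates, actual_weights)]
    return sum(errors) / len(errors)

actual = [150.0, 80.0, 200.0]      # weighed portions in grams (invented)
before = [120.0, 100.0, 260.0]     # estimates before training (invented)
after = [140.0, 85.0, 215.0]       # estimates after training (invented)

print(round(mean_abs_pct_error(before, actual), 1))  # → 25.0
print(round(mean_abs_pct_error(after, actual), 1))   # → 6.8
```

Comparing the two averages, food by food or overall, is how a decrease in estimation error after training would be demonstrated.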

Abstract:

The dissertation presents a political and economic history of the federal government's program to commercialize photovoltaic energy for terrestrial use. Chapter 1 is a detailed history of the program. Chapter 2 is a brief review of the Congressional roll call voting literature. Chapter 3 develops PV benefit measures at the state and Congressional district level necessary for an econometric analysis of PV roll call voting. Chapter 4 presents the econometric analysis.

Because PV power was considerably more expensive than conventional power, the program was designed to make PV a significant power source in the long term, emphasizing research and development, although sizeable amounts have been spent for procurement (direct government purchases and indirectly through tax credits). The decentralized R and D program pursued alternative approaches in parallel, with subsequent funding dependent on earlier progress. Funding rose rapidly in the 1970s before shrinking in the 1980s. Tax credits were introduced in 1978, with the last of the credits due to expire this year.

Major issues in the program have been the appropriate magnitude of demonstrations and government procurement, whether decentralized, residential use or centralized utility generation would first be economic, the role of storage in PV, and the role of PV in a utility's generation mix.

Roll call voting on solar energy (all votes analyzed occurred from 1975-1980) was influenced, in a cross-sectional sense, by all of the predicted influences: party and ideology, local economic benefits of the technology, local PV federal spending and manufacturing, and appropriations committee membership. The cross-sectional results for ideology are consistent with the strongly ideological character of solar energy politics and the timing of the funding increases and decreases discussed in Chapter 1. Local PV spending and manufacturing were less significant than ideology or the economic benefits of the technology. Because time series analysis of the votes was not possible, it was not possible to test the role of economic benefits to the nation as a whole.

Abstract:

In laser-target interaction, the effects of laser intensity on plasma oscillation at the front surface of targets have been investigated by one-dimensional particle-in-cell simulations. Periodic oscillations of the ion density and electrostatic field at the front surface of the targets are reported for the first time, and are considered an intrinsic property of the target excited by the laser. The oscillation period depends only on the initial plasma density and is independent of laser intensity. Flat-top structures with curves in ion phase space are found with a more intense laser pulse, due to the larger amplitude variation of the electrostatic field. A simple but valid model is proposed to interpret the curves.
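A density-only dependence of the period is what one would expect if the oscillation is governed by a plasma frequency, ω_p = (n q² / ε₀ m)^½, which involves the density but not the laser field. A sketch for electrons follows; the density value is illustrative, and for ion oscillations the ion mass and charge state would enter instead:

```python
import math

# Sketch: oscillation period from the electron plasma frequency,
# omega_p = sqrt(n * e^2 / (eps0 * m_e)).  The density below is an
# illustrative overdense-target value, not taken from the paper.
E_CHARGE = 1.602176634e-19      # elementary charge, C
E_MASS = 9.1093837015e-31       # electron mass, kg
EPS0 = 8.8541878128e-12         # vacuum permittivity, F/m

def plasma_period(n_e):
    """Oscillation period (s) for electron density n_e in m^-3."""
    omega_p = math.sqrt(n_e * E_CHARGE**2 / (EPS0 * E_MASS))
    return 2.0 * math.pi / omega_p

n_e = 1e27  # m^-3 (= 1e21 cm^-3), an illustrative solid-density plasma
print(f"{plasma_period(n_e):.2e} s")  # a few femtoseconds
```

Doubling the laser intensity leaves this period unchanged, while doubling the density shortens it by a factor of √2, consistent with the scaling the abstract reports.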

Abstract:

Several types of seismological data, including surface wave group and phase velocities, travel times from large explosions, and teleseismic travel time anomalies, have indicated that there are significant regional variations in the upper few hundred kilometers of the mantle beneath continental areas. Body wave travel times and amplitudes from large chemical and nuclear explosions are used in this study to delineate the details of these variations beneath North America.

As a preliminary step in this study, theoretical P wave travel times, apparent velocities, and amplitudes have been calculated for a number of proposed upper mantle models, those of Gutenberg, Jeffreys, Lehman, and Lukk and Nersesov. These quantities have been calculated for both P and S waves for model CIT11GB, which is derived from surface wave dispersion data. First arrival times for all the models except that of Lukk and Nersesov are in close agreement, but the travel time curves for later arrivals are both qualitatively and quantitatively very different. For model CIT11GB, there are two large, overlapping regions of triplication of the travel time curve, produced by regions of rapid velocity increase near depths of 400 and 600 km. Throughout the distance range from 10 to 40 degrees, the later arrivals produced by these discontinuities have larger amplitudes than the first arrivals. The amplitudes of body waves, in fact, are extremely sensitive to small variations in the velocity structure, and provide a powerful tool for studying structural details.

Most of eastern North America, including the Canadian Shield, has a Pn velocity of about 8.1 km/sec, with a nearly abrupt increase in compressional velocity of ~ 0.3 km/sec at a depth varying regionally between 60 and 90 km. Variations in the structure of this part of the mantle are significant even within the Canadian Shield. The low-velocity zone is a minor feature in eastern North America and is subject to pronounced regional variations. It is 30 to 50 km thick, and occurs somewhere in the depth range from 80 to 160 km. The velocity decrease is less than 0.2 km/sec.

Consideration of the absolute amplitudes indicates that the attenuation due to anelasticity is negligible for 2 Hz waves in the upper 200 km along the southeastern and southwestern margins of the Canadian Shield. For compressional waves the average Q for this region is > 3000. The amplitudes also indicate that the velocity gradient is at least 2 × 10⁻³ both above and below the low-velocity zone, implying that the temperature gradient is < 4.8 °C/km if the regions are chemically homogeneous.

In western North America, the low-velocity zone is a pronounced feature, extending to the base of the crust and having minimum velocities of 7.7 to 7.8 km/sec. Beneath the Colorado Plateau and Southern Rocky Mountains provinces, there is a rapid velocity increase of about 0.3 km/sec, similar to that observed in eastern North America, but near a depth of 100 km.

Complicated travel time curves observed on profiles with stations in both eastern and western North America can be explained in detail by a model taking into account the lateral variations in the structure of the low-velocity zone. These variations involve primarily the velocity within the zone and the depth to the top of the zone; the depth to the bottom is, for both regions, between 140 and 160 km.

The depth to the transition zone near 400 km also varies regionally, by about 30-40 km. These differences imply variations of 250 °C in the temperature or 6 % in the iron content of the mantle, if the phase transformation of olivine to the spinel structure is assumed responsible. The structural variations at this depth are not correlated with those at shallower depths, and follow no obvious simple pattern.

The computer programs used in this study are described in the Appendices. The program TTINV (Appendix IV) fits spherically symmetric earth models to observed travel time data. The method, described in Appendix III, resembles conventional least-square fitting, using partial derivatives of the travel time with respect to the model parameters to perturb an initial model. The usual ill-conditioned nature of least-squares techniques is avoided by a technique which minimizes both the travel time residuals and the model perturbations.
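The perturbation scheme described, minimizing both the travel-time residuals and the size of the model perturbations, is a damped least-squares iteration. A toy sketch on a hypothetical one-parameter model (fitting a single velocity to times t = d/v; TTINV's actual parameterization is of course richer):

```python
# Toy sketch of a TTINV-style damped least-squares iteration (assumption:
# illustrative one-parameter model, not the thesis's).  Observed times obey
# t = d / v for a homogeneous medium; we recover v by repeatedly perturbing
# an initial model with the step  dv = (G^T G + theta)^(-1) G^T r,
# where G holds the partials dt/dv and r the travel-time residuals.

distances = [100.0, 200.0, 300.0]          # epicentral distances, km
t_obs = [d / 8.1 for d in distances]       # "observed" times for v = 8.1 km/s

v = 7.0          # initial model, km/s
theta = 1e-3     # damping: trades residual fit against perturbation size
for _ in range(20):
    residuals = [t - d / v for t, d in zip(t_obs, distances)]
    partials = [-d / v**2 for d in distances]          # dt/dv at current model
    gtg = sum(g * g for g in partials) + theta
    gtr = sum(g * r for g, r in zip(partials, residuals))
    v += gtr / gtg                                     # damped model update

print(round(v, 2))  # → 8.1, the true velocity
```

The damping term theta is what tames the ill-conditioning: with theta = 0 this is ordinary Gauss-Newton least squares, while larger theta shrinks each perturbation, stabilizing the inversion at the cost of slower convergence.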

Spherically symmetric earth models, however, have been found inadequate to explain most of the observed travel times in this study. TVT4, a computer program that performs ray theory calculations for a laterally inhomogeneous earth model, is described in Appendix II. Appendix I gives a derivation of seismic ray theory for an arbitrarily inhomogeneous earth model.

Abstract:

The Everett interpretation of quantum mechanics is an increasingly popular alternative to the traditional Copenhagen interpretation, but a few major issues prevent its widespread adoption. One of these issues is the origin of probabilities in the Everett interpretation, which this thesis attempts to survey. The most successful resolution of the probability problem thus far is the decision-theoretic program, which attempts to frame probabilities as the outcomes of rational decision making. This marks a departure from orthodox interpretations of probabilities in the physical sciences, where probabilities are thought to be objective, stemming from symmetry considerations. This thesis attempts to offer an evaluation of the decision-theoretic program.