Abstract:
An evaluation of milk urea nitrogen (MUN) as a diagnostic of protein feeding in dairy cows was performed using mean treatment data (n = 306) from 50 production trials conducted in Finland (n = 48) and Sweden (n = 2). Data were used to assess the effects of diet composition and certain animal characteristics on MUN and to derive relationships between MUN and the efficiency of N utilization for milk production and urinary N excretion. Relationships were developed using regression analysis based either on models of fixed factors or on mixed models that account for between-experiment variation. Dietary crude protein (CP) content was the best single predictor of MUN and accounted for proportionately 0.778 of total variance [MUN (mg/dL) = -14.2 + 0.17 × dietary CP content (g/kg dry matter)]. The proportion of variation explained by this relationship increased to 0.952 when a mixed model including the random effect of study was used, but both the intercept and slope remained unchanged. Use of rumen-degradable CP concentration in excess of predicted requirements, or of the ratio of dietary CP to metabolizable energy, as single predictors did not explain more of the variation in MUN (R² = 0.767 and 0.778, respectively) than dietary CP content. Inclusion of other dietary factors alongside dietary CP content in bivariate models resulted in only marginally better predictions of MUN (R² = 0.785 to 0.804). Closer relationships existed between MUN and dietary factors when nutrients (CP and metabolizable energy) were expressed as concentrations in the diet rather than as absolute intakes. Furthermore, both MUN and MUN secretion (g/d) provided more accurate predictions of urinary N excretion (R² = 0.787 and 0.835, respectively) than measurements of the efficiency of N utilization for milk production (R² = 0.769). It is concluded that dietary CP content is the most important nutritional factor influencing MUN, and that measurements of MUN can be used as a diagnostic of protein feeding in the dairy cow and to predict urinary N excretion.
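For illustration, a minimal Python sketch applying the fixed-effects regression reported above; the function name and the example CP value are illustrative choices, not taken from the study.

```python
def predict_mun(cp_g_per_kg_dm):
    """Predicted milk urea nitrogen (mg/dL) from dietary crude protein
    content (g/kg dry matter), using the fixed-effects regression quoted
    in the abstract (R^2 = 0.778)."""
    return -14.2 + 0.17 * cp_g_per_kg_dm

# Hypothetical example: a diet containing 170 g CP/kg DM
print(predict_mun(170))  # -> 14.7 mg/dL
```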
Abstract:
Cardiovascular disease (CVD) is responsible for significant morbidity and mortality in the Western and developing world. This multifactorial disease is influenced by many environmental and genetic factors. At present, public health advice involves prescribed population-based recommendations, which have been largely unsuccessful in reducing CVD risk. This is due, in part, to individual variability in response to dietary manipulations, which arises from nutrient-gene interactions (defined by the term 'nutrigenetics'). The shift towards personalized nutritional advice is a very attractive proposition: in principle, an individual can be given dietary advice specifically tailored to their genotype. However, the evidence base for the impact of interactions between nutrients and fixed genetic variants on biomarkers of CVD risk is still very limited. This paper reviews the evidence for interactions between dietary fat and two common polymorphisms in the apolipoprotein E and peroxisome proliferator-activated receptor-gamma genes. Although an increased understanding of how these and other genes influence the response to nutrients should facilitate the progression of personalized nutrition, the ethical issues surrounding its routine use need careful consideration.
Abstract:
An unaltered rearrangement, at the algorithmic level, of the original computation of a neural-based predictor is introduced as a new organization. Its FPGA implementation generates circuits that are 1.7 times faster than a direct implementation of the original algorithm. This faster clock rate allows predictors with longer history lengths to be implemented using nearly the same hardware budget.
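The abstract does not name the predictor, but "neural-based" predictors with history lengths in this literature are typically perceptron branch predictors. Purely as a hedged illustration of the baseline computation being reorganized (a per-branch dot product of weights with the global history), here is a minimal Python sketch; all names and parameters are assumptions, and the paper's FPGA rearrangement itself is not shown.

```python
# Minimal perceptron-style branch predictor (Jimenez & Lin style baseline).
class PerceptronPredictor:
    def __init__(self, n_entries=256, history_len=12):
        self.h = history_len
        self.theta = int(1.93 * history_len + 14)  # common training threshold
        self.weights = [[0] * (history_len + 1) for _ in range(n_entries)]
        self.history = [1] * history_len           # +1 = taken, -1 = not taken

    def _output(self, pc):
        w = self.weights[pc % len(self.weights)]
        return w[0] + sum(wi * hi for wi, hi in zip(w[1:], self.history))

    def predict(self, pc):
        return self._output(pc) >= 0               # True = predict taken

    def update(self, pc, taken):
        y = self._output(pc)
        t = 1 if taken else -1
        # Train on a misprediction or when the output is below threshold.
        if (y >= 0) != taken or abs(y) <= self.theta:
            w = self.weights[pc % len(self.weights)]
            w[0] += t                               # bias weight
            for i in range(self.h):
                w[i + 1] += t * self.history[i]
        self.history = [t] + self.history[:-1]     # shift in the outcome
```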
Abstract:
Movement disorders (MD) comprise a group of neurological disorders that involve the neuromotor systems. MD can result in several abnormalities, ranging from an inability to move to severe, constant and excessive movements. Stroke is a leading cause of disability, largely affecting older people worldwide. Traditional treatments rely on physiotherapy that is partially based on theory and heavily reliant on the therapist's training and past experience. The lack of evidence proving that one treatment is more effective than another makes the rehabilitation of stroke patients a difficult task. Upper limb (UL) motor re-learning and recovery levels tend to improve with intensive physiotherapy delivery. The need for conclusive evidence supporting one method over another, and the need to stimulate the stroke patient, clearly suggest that traditional methods lack high motivational content, as well as objective, standardised analytical methods for evaluating a patient's performance and assessing therapy effectiveness. Despite all the advances in machine-mediated therapies, there is still a need to improve therapy tools. This chapter describes a new approach to robot-assisted neuro-rehabilitation for upper limb rehabilitation. Gentle/S introduces a new approach to the integration of appropriate haptic technologies with high-quality virtual environments, so as to deliver challenging and meaningful therapies to people with upper limb impairment as a consequence of stroke. The described approach can enhance traditional therapy tools, provide therapy "on demand" and present accurate, objective measurements of a patient's progression. Our recent studies suggest that the use of tele-presence and VR-based systems can motivate patients to exercise for longer periods of time. Two identical prototypes have undergone extended clinical trials in the UK and Ireland with a cohort of 30 stroke subjects. From the lessons learnt with the Gentle/S approach, it is also clear that high-quality therapy devices of this nature have a role in the future delivery of stroke rehabilitation, and that machine-mediated therapies should be available to the patient and his/her clinical team from initial hospital admission through to long-term placement in the patient's home following hospital discharge.
Abstract:
The process by which contractors take account of risk when calculating their bids for construction work is investigated, based on preliminary investigations and case studies in Ghana and the UK. The two countries were chosen, more or less arbitrarily, for the purpose of case studies and to test the idea that there are systematic differences between the approaches taken in different places. Clear differences were found in the risk pricing approaches of contractors in the two countries. The differences appeared to emanate from the professional knowledge and competence of the bid team members, company policy, corporate accountability and the business environments in which the contractors operate. Both groups of contractors take account of risk in estimates. However, risk accountability was found to be higher on the agenda in the tender process of UK contractors, documented more systematically, and assessed and managed more rigorously with input from the whole bid team. Risk accountability takes place at three levels of the tender process and is dictated strongly by market forces and company circumstances.
Abstract:
We propose a bridge between two important parallel programming paradigms: data parallelism and communicating sequential processes (CSP). Data-parallel pipelined architectures obtained with the Alpha language can be embedded in a control-intensive application expressed in the CSP-based Handel formalism. The interface is formally defined from the semantics of the Alpha and Handel languages. This work will ease the design of compute-intensive applications on FPGAs.
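Alpha and Handel target hardware, so no faithful software rendering is possible here. Purely as an analogy for the paradigm bridge described, the following Python sketch embeds a data-parallel (element-wise) step inside CSP-style processes that communicate over channels (modelled as queues); all names are illustrative.

```python
import threading, queue

def stage(fn, inp, out):
    """A CSP-style process: repeatedly receive on one channel, apply a
    data-parallel (element-wise) operation, and send on the next channel."""
    while True:
        block = inp.get()
        if block is None:                 # poison pill terminates the pipeline
            out.put(None)
            return
        out.put([fn(x) for x in block])   # the data-parallel step

a, b, c = queue.Queue(), queue.Queue(), queue.Queue()
threading.Thread(target=stage, args=(lambda x: x * 2, a, b)).start()
threading.Thread(target=stage, args=(lambda x: x + 1, b, c)).start()

a.put([1, 2, 3]); a.put(None)             # control side: feed data, then stop
print(c.get())                            # -> [3, 5, 7]
```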
Abstract:
In the decade since OceanObs '99, great advances have been made in the field of ocean data dissemination. The use of Internet technologies has transformed the landscape: users can now find, evaluate and access data rapidly and securely using only a web browser. This paper describes the current state of the art in dissemination methods for ocean data, focussing particularly on ocean observations from in situ and remote sensing platforms. We discuss current efforts being made to improve the consistency of delivered data and to increase the potential for automated integration of diverse datasets. An important recent development is the adoption of open standards from the Geographic Information Systems community; we discuss the current impact of these new technologies and their future potential. We conclude that new approaches will indeed be necessary to exchange data more effectively and forge links between communities, but these approaches must be evaluated critically through practical tests, and existing ocean data exchange technologies must be used to their best advantage. Investment in key technology components, cross-community pilot projects and the enhancement of end-user software tools will be required in order to assess and demonstrate the value of any new technology.
Abstract:
The self-consistent field theory (SCFT) introduced by Helfand for diblock copolymer melts is expected to converge to the strong-segregation theory (SST) of Semenov in the asymptotic limit, $\chi N \rightarrow \infty$. However, past extrapolations of the lamellar/cylinder and cylinder/sphere phase boundaries, within the standard unit-cell approximation, have cast some doubts on whether or not this is actually true. Here we push the comparison further by extending the SCFT calculations to $\chi N = 512,000$, by accounting for exclusion zones in the coronae of the cylindrical and spherical unit cells, and by examining finite-segregation corrections to SST. In doing so, we provide the first compelling evidence that SCFT does indeed reduce to SST.
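For context (a standard Semenov strong-segregation result, not restated in the abstract): balancing chain stretching against interfacial energy gives the familiar lamellar scaling, which is the asymptotic behaviour SCFT should recover as $\chi N \rightarrow \infty$. Prefactors are omitted.

```latex
% Scaling argument only (prefactors omitted); standard SST, quoted for context.
\[
  \frac{F}{k_B T} \sim
    \underbrace{\frac{d^{2}}{a^{2}N}}_{\text{chain stretching}}
  + \underbrace{\frac{a N \sqrt{\chi}}{d}}_{\text{interfacial}}
  \quad\Longrightarrow\quad
  d^{*} \sim a\, N^{2/3} \chi^{1/6}
\]
```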
Abstract:
This paper reports on the design and manufacture of an ultra-wide (5-30 µm) infrared edge filter for use in FTIR studies of the low-frequency vibrational modes of metallo-proteins. We present details of the spectral design and manufacture of a filter that meets the demanding bandwidth and transparency requirements of the application, together with spectra illustrating the new data made possible by such a filter. A design model of the filter and the materials used in its construction has been developed, capable of accurately predicting spectral performance at both 300 K and at the reduced operating temperature of 200 K. This design model is based on the optical and semiconductor properties of a multilayer filter containing PbTe (IV-VI) layer material in combination with the dielectric dispersion of ZnSe (II-VI), deposited on a CdTe (II-VI) substrate, together with the use of BaF2 (II-VII) as an antireflection layer. Comparisons between the computed spectral performance of the model and spectral measurements from manufactured coatings over a wavelength range of 4-30 µm and a temperature range of 300-200 K are presented. Finally, we present the results of FTIR measurements of Photosystem II showing the improvement in the signal-to-noise ratio of the measurement due to the filter, together with a light-induced FTIR difference spectrum of Photosystem II.
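The paper's dispersion-aware design model is not reproduced here. As a generic illustration of the standard characteristic-matrix (transfer-matrix) method used to predict the transmittance of such multilayer filters, here is a Python sketch at normal incidence with constant, dispersion-free indices; the indices, layer counts and thicknesses are rough placeholder values, not the manufactured design.

```python
import numpy as np

def multilayer_transmittance(wavelengths_um, layers, n_inc=1.0, n_sub=2.7):
    """Normal-incidence transmittance of a thin-film stack via the
    characteristic-matrix method. layers: list of (index, thickness_um)."""
    T = np.empty_like(wavelengths_um)
    for i, lam in enumerate(wavelengths_um):
        M = np.eye(2, dtype=complex)
        for n, d in layers:
            delta = 2.0 * np.pi * n * d / lam          # phase thickness
            eta = n                                    # admittance, normal incidence
            M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / eta],
                              [1j * eta * np.sin(delta), np.cos(delta)]])
        B, C = M @ np.array([1.0, n_sub])
        T[i] = 4.0 * n_inc * n_sub / abs(n_inc * B + C) ** 2
    return T

# Illustrative quarter-wave PbTe/ZnSe stack at a 10 um reference wavelength;
# indices are rough mid-IR placeholders, not the paper's measured values.
lam0, n_pbte, n_znse = 10.0, 5.6, 2.4
stack = [(n_pbte, lam0 / (4 * n_pbte)), (n_znse, lam0 / (4 * n_znse))] * 4
lams = np.linspace(5.0, 30.0, 500)
print(multilayer_transmittance(lams, stack)[::100].round(3))
```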
Abstract:
There is remarkable agreement in expectations today for vastly improved ocean data management a decade from now -- capabilities that will help to bring significant benefits to ocean research and to society. Advancing data management to such a degree, however, will require cultural and policy changes that are slow to effect. The technological foundations upon which data management systems are built are certain to continue advancing rapidly in parallel. These considerations argue for adopting attitudes of pragmatism and realism when planning data management strategies. In this paper we adopt those attitudes as we outline opportunities for progress in ocean data management. We begin with a synopsis of expectations for integrated ocean data management a decade from now. We discuss factors that should be considered by those evaluating candidate “standards”. We highlight challenges and opportunities in a number of technical areas, including “Web 2.0” applications, data modeling, data discovery and metadata, real-time operational data, data archival, biological data management and satellite data management. We discuss the importance of investment in the development of software toolkits to accelerate progress. We conclude by recommending a few specific, short-term implementation targets that we believe to be both significant and achievable, and by calling for action by community leadership to effect these advancements.