943 results for Approximate Bayesian computation, Posterior distribution, Quantile distribution, Response time data


Relevance: 100.00%

Abstract:

The country has witnessed a tremendous increase in the vehicle population and in axle loading patterns during the last decade, leaving its road network overstressed and leading to premature failure. The type of deterioration present in the pavement should be considered when determining whether it has a functional or structural deficiency, so that an appropriate overlay type and design can be developed. Structural failure arises from conditions that adversely affect the load-carrying capability of the pavement structure. Inadequate thickness, cracking, distortion and disintegration cause structural deficiency. Functional deficiency arises when the pavement does not provide a smooth riding surface and comfort to the user. This can be due to poor surface friction and texture, hydroplaning and splash from the wheel path, rutting and excess surface distortion such as potholes, corrugation, faulting, blow-ups, settlement, heaves, etc. Functional condition determines the level of service provided by the facility to its users at a particular time and also the Vehicle Operating Costs (VOC), thus influencing the national economy. Prediction of pavement deterioration is helpful for assessing the remaining effective service life (RSL) of the pavement structure on the basis of reduction in performance levels, and for applying various alternative designs and rehabilitation strategies with a long-range funding requirement for pavement preservation. In addition, such models can predict the impact of treatment on the condition of the sections. Infrastructure prediction models can thus be classified into four groups, namely primary response models, structural performance models, functional performance models and damage models. The factors affecting the deterioration of roads are very complex in nature and vary from place to place.
Hence there is a need for a thorough study of the deterioration mechanism under varied climatic zones and soil conditions before arriving at a definite strategy for road improvement. Realizing the need for a detailed study involving all types of roads in the state with varying traffic and soil conditions, the present study has been attempted. This study attempts to identify the parameters that affect the performance of roads and to develop performance models suitable for Kerala conditions. A critical review of the various factors that contribute to pavement performance is presented, based on data collected from selected road stretches and from five corporations of Kerala. These roads represent urban conditions as well as National Highways, State Highways and Major District Roads in suburban and rural conditions. This research work is a pursuit towards a study of the road condition of Kerala with respect to varying soil, traffic and climatic conditions, periodic performance evaluation of selected roads of representative types, and development of distress prediction models for the roads of Kerala. In order to achieve this aim, the study is divided into two parts. The first part deals with the study of the pavement condition and subgrade soil properties of urban roads distributed across five Corporations of Kerala, namely Thiruvananthapuram, Kollam, Kochi, Thrissur and Kozhikode. From the 44 selected roads, 68 homogeneous sections were studied. The data collected on the functional and structural condition of the surface include pavement distress in terms of cracks, potholes, rutting, raveling and pothole patching. The structural strength of the pavement was measured as rebound deflection using Benkelman Beam deflection studies. In order to collect details of the pavement layers and determine the subgrade soil properties, trial pits were dug and the in-situ field density was found using the Sand Replacement Method.
Laboratory investigations were carried out to determine the subgrade soil properties: soil classification, Atterberg limits, Optimum Moisture Content, Field Moisture Content and 4-day soaked CBR. The relative compaction in the field was also determined. Traffic details were collected by conducting a traffic volume count survey and an axle load survey. From the data thus collected, the strength of the pavement, which is a function of the layer coefficients and thicknesses, was calculated and represented as the Structural Number (SN). This was further related to the CBR value of the soil to obtain the Modified Structural Number (MSN). The condition of the pavement was represented in terms of the Pavement Condition Index (PCI), which is a function of the surface distress at the time of the investigation and was calculated in the present study using the deduct value method developed by the U.S. Army Corps of Engineers. The influence of subgrade soil type and pavement condition on the relationship between MSN and rebound deflection was studied using appropriate plots for the predominant soil types and for classified values of the Pavement Condition Index. This relationship will help practising engineers design the overlay thickness required for a pavement without conducting the BBD test. Regression analysis using SPSS was carried out with various trials to find the best-fit relationship between rebound deflection and CBR and other soil properties for Gravel, Sand, Silt and Clay fractions. The second part of the study deals with periodic performance evaluation of selected road stretches representing National Highways (NH), State Highways (SH) and Major District Roads (MDR), located in different geographical conditions and with varying traffic. Eight road sections, divided into 15 homogeneous sections, were selected for the study and six sets of continuous periodic data were collected.
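The SN and MSN calculations described above can be sketched in a few lines. The layer coefficients below are invented for illustration, and the subgrade correction uses the well-known TRL form of the Modified Structural Number, which may differ from the exact relation used in the study:

```python
import math

def structural_number(layers):
    """SN = sum over layers of (strength coefficient x thickness, inches)."""
    return sum(a * d for a, d in layers)

def modified_structural_number(sn, cbr):
    """MSN adds a subgrade contribution from CBR (TRL form; an assumption here)."""
    lc = math.log10(cbr)
    return sn + 3.51 * lc - 0.85 * lc ** 2 - 1.43

# Hypothetical two-layer pavement: bituminous surface over a granular base
layers = [(0.44, 4.0), (0.14, 8.8)]
sn = structural_number(layers)
msn = modified_structural_number(sn, cbr=10)
```

With these made-up inputs, SN is 2.992 and the CBR term adds 1.23, giving an MSN of about 4.22; the study's regressions relate quantities of this kind to the measured rebound deflection.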
The periodic data collected include the functional and structural condition in terms of distress (potholes, pothole patches, cracks, rutting and raveling), skid resistance using a portable skid resistance pendulum, surface unevenness using a Bump Integrator, texture depth using the sand patch method and rebound deflection using the Benkelman Beam. Baseline data for the study stretches were collected as one-time data. Pavement history was obtained as secondary data. Pavement drainage characteristics were collected in terms of camber or cross slope, using a camber board (slope meter), for the carriageway and shoulders, availability of a longitudinal side drain, presence of a valley, terrain condition, soil moisture content, water table data, High Flood Level, rainfall data, land use and cross slope of the adjoining land. These data were used to determine the drainage condition of the study stretches. Traffic studies were conducted, including classified volume counts and axle load studies. From the field data thus collected, the progression of each parameter was plotted for all the study roads and validated for accuracy. The Structural Number (SN) and Modified Structural Number (MSN) were calculated for the study stretches. Progression of the deflection, distress, unevenness, skid resistance and macrotexture of the study roads was evaluated. Since the deterioration of a pavement is a complex phenomenon to which all the above factors contribute, pavement deterioration models were developed as non-linear regression models, using SPSS, with the periodic data collected for all the above road stretches. General models were developed for cracking progression, raveling progression, pothole progression and roughness progression using SPSS. A model for construction quality was also developed. Calibration of the HDM-4 pavement deterioration models for local conditions was done using the data for cracking, raveling, potholes and roughness. Validation was done using the data collected in 2013.
The application of HDM-4 to compare different maintenance and rehabilitation options was studied, considering deterioration parameters such as cracking, potholes and raveling. The alternatives considered for analysis were a base alternative with crack sealing and patching, an overlay with 40 mm BC using ordinary bitumen, an overlay with 40 mm BC using Natural Rubber Modified Bitumen, and an overlay of Ultra Thin White Topping. Economic analysis of these options was done considering the Life Cycle Cost (LCC). The average speeds that could be obtained by applying these options were also compared. The results were in favour of Ultra Thin White Topping over flexible pavements. Hence, design charts were also plotted for the estimation of maximum wheel load stresses for different slab thicknesses under different soil conditions. The design charts show the maximum stress for a particular slab thickness and different soil conditions, incorporating different k values. These charts can be handy for a design engineer. Fuzzy rule based models developed for site-specific conditions were compared with the regression models developed using SPSS. The Riding Comfort Index (RCI) was calculated and correlated with unevenness to develop a relationship. Relationships were also developed between the Skid Number and the macrotexture of the pavement. The effort made through this research work will help highway engineers understand the behaviour of flexible pavements under Kerala conditions and arrive at suitable maintenance and rehabilitation strategies. Key Words: Flexible Pavements – Performance Evaluation – Urban Roads – NH – SH and other roads – Performance Models – Deflection – Riding Comfort Index – Skid Resistance – Texture Depth – Unevenness – Ultra Thin White Topping

Relevance: 100.00%

Abstract:

The importance of service level management (SLM) for enterprise applications grows with the increasing criticality of IT-supported processes for the success of individual companies. Traditionally, an effective SLM is implemented by establishing monitoring processes in hierarchical management environments that support an administrator in the necessary reconfiguration of systems. These hierarchical approaches, however, are only applicable to a very limited extent to current, highly dynamic software architectures. One example is service-oriented architectures (SOA), in which the business functionality is modelled by the interplay of individual, mutually independent services on the basis of descriptive workflow specifications. This results in a high run-time dynamism of the overall architecture. For SLM, the decentralized structure of a SOA, with different administrative responsibilities for individual subsystems, is particularly problematic, since regulating interventions are possible only to a very limited degree, on the one hand because the implementation of individual services is encapsulated, and on the other because a central controlling authority is lacking. This thesis defines the architecture of an SLM system for SOA environments in which autonomous management components cooperate in order to meet higher-level service level objectives: using self-management technologies, service level management is first automated at the level of individual services. The autonomous management components of these services can then pursue overarching goals for the optimization of service quality and resource usage by means of self-organization mechanisms. For SLM at the level of SOA workflows, temporary cross-service cooperations have to be established to satisfy service level requirements; these cooperations can therefore also span several administrative domains.
Such a temporally limited cooperation of autonomous subsystems can reasonably only take place in a decentralized fashion, since the respective cooperation partners are not known in advance and, depending on the lifetime of individual workflows, participating components may be exchanged at run time. The thesis develops a method for coordinating autonomous management components with the goal of optimizing response times at the workflow level: by transferring response-time shares among one another, management components can tighten or loosen their individual targets without changing the overall response-time target. The transfer of response-time shares is realized by means of an auction mechanism. The technical basis of the cooperation is a group communication mechanism. Furthermore, with regard to the use of shared, virtualized resources, competing services are prioritized according to business goals. As part of the practical implementation, the realization of central architectural elements and of the developed self-organization methods is presented, using the SLM of concrete components as an example. A hybrid simulation approach is used to study management cooperation in larger scenarios. The evaluation includes investigations of the scalability of the approach, focusing on a system of cooperating management components, in particular with respect to communication overhead. The evaluation shows that cross-service, autonomous performance management is possible in SOA environments. The results suggest that the developed approach can also be applied successfully in large environments.
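The core invariant of the coordination scheme — individual response-time targets may be tightened or loosened, but their sum stays fixed — can be illustrated with a toy transfer function. This is a sketch of our own with invented service names; the thesis realizes the transfer through an auction over group communication:

```python
def transfer_share(targets, seller, buyer, amount_ms):
    """Move part of the seller's response-time budget (ms) to the buyer.

    The summed workflow-level target is unchanged, which is exactly what
    lets components renegotiate locally without violating the overall goal.
    """
    t = dict(targets)
    if t[seller] < amount_ms:
        raise ValueError("seller cannot give away more budget than it holds")
    t[seller] -= amount_ms
    t[buyer] += amount_ms
    return t

# Hypothetical per-service response-time budgets within one workflow
targets = {"CreditCheck": 200.0, "Scoring": 300.0}
renegotiated = transfer_share(targets, "CreditCheck", "Scoring", 50.0)
```

In an auction-based realization, `amount_ms` and the trading partners would be determined by bids reflecting how much slack each component currently has.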

Relevance: 100.00%

Abstract:

Web services from different partners can be combined into applications that realize a more complex business goal. Such applications, built as Web service compositions, define how interactions between Web services take place in order to implement the business logic. Web service compositions not only have to provide the desired functionality but also have to comply with certain Quality of Service (QoS) levels. Maximizing the users' satisfaction, also reflected as Quality of Experience (QoE), is a primary goal to be achieved in a Service-Oriented Architecture (SOA). Unfortunately, in a dynamic environment like SOA, unforeseen situations might appear, such as services not being available or not responding in the desired time frame. In such situations, appropriate actions need to be triggered in order to avoid the violation of QoS and QoE constraints. In this thesis, solutions are developed to manage Web services and Web service compositions with regard to QoS and QoE requirements. The Business Process Rules Language (BPRules) was developed to manage Web service compositions when undesired QoS or QoE values are detected. BPRules provides a rich set of management actions that may be triggered for controlling the service composition and for improving its quality behavior. Regarding the quality properties, BPRules distinguishes between the QoS values as promised by the service providers, the QoE values assigned by end-users, the monitored QoS as measured by our BPR framework, and the predicted QoS and QoE values. BPRules facilitates the specification of user groups characterized by different context properties and allows triggering a personalized, context-aware service selection tailored to the specified user groups. In a service market where a multitude of services with the same functionality but different quality values are available, the right services need to be selected for realizing the service composition.
We developed new and efficient heuristic algorithms that are applied to choose high-quality services for the composition. BPRules offers the possibility to integrate multiple service selection algorithms. The selection algorithms are also applicable to non-linear objective functions and constraints. The BPR framework includes new approaches for context-aware service selection and quality property prediction. We consider the location of users and services as a context dimension for the prediction of response time and throughput. The BPR framework combines all new features and contributions into a comprehensive management solution. Furthermore, it facilitates flexible monitoring of QoS properties without having to modify the description of the service composition. We show how the different modules of the BPR framework work together in order to execute the management rules. We evaluate how our selection algorithms outperform a genetic algorithm from related research. The evaluation also reveals how context data can be used for a personalized prediction of response time and throughput.
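The kind of QoS-aware selection problem described above can be sketched with a single greedy pass: for each task in the composition, pick the candidate with the best weighted utility that still respects the end-to-end response-time bound. This is a generic illustration, not the thesis' algorithms; the service names, weights and numbers are invented:

```python
def select_services(tasks, max_total_rt, w_rt=0.5, w_cost=0.5):
    """Greedy pass: per task, choose the candidate with the lowest weighted
    utility (response time, cost) that keeps the accumulated response time
    within the end-to-end bound. Candidates are (name, response_time, cost)."""
    chosen, total_rt = [], 0.0
    for candidates in tasks:
        feasible = [c for c in candidates if total_rt + c[1] <= max_total_rt]
        if not feasible:
            return None  # a single greedy pass cannot satisfy the constraint
        best = min(feasible, key=lambda c: w_rt * c[1] + w_cost * c[2])
        chosen.append(best[0])
        total_rt += best[1]
    return chosen

tasks = [
    [("payA", 2.0, 5.0), ("payB", 1.0, 9.0)],    # hypothetical payment services
    [("shipA", 3.0, 2.0), ("shipB", 2.0, 8.0)],  # hypothetical shipping services
]
plan = select_services(tasks, max_total_rt=5.0)
```

Real selection heuristics refine this idea, e.g. by backtracking when the greedy pass hits an infeasible task or by normalizing each QoS dimension before weighting.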

Relevance: 100.00%

Abstract:

This undergraduate thesis shows how the response time of an ambulance company in Bogotá has been optimized and how this has helped the city's emergency services improve their quality and coverage. The ambulance company is Transporte Ambulatorio Medico Ltda., and a brief account of its history is given in the document. To determine whether an improvement has actually occurred, a previous study carried out at the Universidad de los Andes was used as a baseline and compared against a current sample collected by the authors of this work. Queueing-theory principles and statistical tools were used to support the conclusions of the document. The authors also propose a possible solution to improve the response time even further.

Relevance: 100.00%

Abstract:

Introduction: The Peer Tutors (TP) programme is an initiative created in the EMCS of the Universidad del Rosario that provides academic support to less advanced peers through student tutors. It systematically equips its tutors with tools for performing the guiding role harmoniously and with skills for managing knowledge. This study explores the possible 'impacts' generated by the participation of medical students as TPs within a structured programme. Materials and methods: Qualitative study involving the construction and application of surveys to focus groups (TPs, teachers and relatives), built around six axes/categories that frame the ideal physician. The answers obtained from closed questions (on a rating scale) and open questions were subjected to descriptive analysis (modes) and triangulation. Results: 41 tutors, grouped into 4 analysis groups, showed an overall positive impact, predominantly in interpersonal skills (60%, 65%, 66%, 45%, respectively), practice-based functions/activities and improvement (57%, 67%, 60%, 45%), and the way knowledge is applied (47%, 70%, 67%, 48%). Eight surveyed teachers considered the programme's impact relevant for interpersonal skills (49%), knowledge (42%) and interaction with colleagues (38%). Among parents there is consensus on changes in interpersonal skills, in practice-based functions and improvement, and in ethical/moral attitudes and values. These results parallel the observations recorded in the open questions. Conclusions: An overall positive impact on professional training and performance was observed after participation as TPs within the programme, a finding that supports published reports of similar academic experiences.

Relevance: 100.00%

Abstract:

Introduction: It has been established that long working shifts produce cognitive and functional impairment, with consequent effects on health services, medicine being one of the disciplines most at risk of errors during care processes. This study therefore aimed to evaluate the impact of the working shift on the attention capacity of emergency physicians. Methods: A cross-sectional study was carried out using the Psychomotor Vigilance Test, which evaluates a person's attention capacity after different activities according to the response time in milliseconds. A sample of the emergency medical staff of the Fundación Santa Fé de Bogotá was taken, comparing the same participant across the different shifts. Results: The study documented a mean response time of 436.6 ms (95% CI 401-477) at the start of the day shift and 443.1 ms (95% CI 388-484) at the end. For the night shift, the initial mean response time was 422.8 ms (95% CI 403-457) and the final one 467.44 ms (95% CI 423-501). Discussion: We found statistically significant differences in response time between the day and night shifts. It is therefore advisable to create state policies that manage the working hours of health personnel so that patient safety and quality of care come first, avoiding as far as possible any chance of medical error.

Relevance: 100.00%

Abstract:

Avanzalogistic is a multi-sided platform belonging to the new information technologies and services, and is therefore considered a new option for the city of Bogotá, focused on solving freight transport for the small and medium-sized enterprises of one sector of the city. Its differentiation lies in personalized assistance and attention for this type of client: through real-time load configuration and online convenience, clients can receive the best service, in the shortest possible time and at the best prices on the market. Considering the needs for reliability, security, speed and good attention demanded by our target segment, Avanzalogistic proposes a service of integral quality in which, through customer-focused performance and coverage of the customer's needs, it can potentially be recognized as one of the best options on the market for companies that need to transport their cargo quickly and safely, reducing the response time from 24 hours (the competition's current response time) to an immediate response (Avanzalogistic's time). On this basis, the project calls for an initial investment of COP 135,627,200, reaching its break-even point in month six of the first year and recovering the investment from the second year onwards. Avanzalogistic is a project with a profitability of 68.51%, justified by the low investment and low operating costs, its main advantage being the outsourcing of vehicles, in addition to the great opportunity of a potential market large enough to show revenues that justify the margin on each service provided. It is thus a project that is not only viable, but also has growth potential and is attractive for investment.

Relevance: 100.00%

Abstract:

This paper discusses a study to determine the relation between sensation level and response time to acoustic stimuli.

Relevance: 100.00%

Abstract:

The paper reports an interactive tool for calibrating a camera, suitable for use in outdoor scenes. The motivation for the tool was the need to obtain an approximate calibration for images taken with no explicit calibration data. Such images are frequently presented to research laboratories, especially in surveillance applications, with a request to demonstrate algorithms. The method decomposes the calibration parameters into intuitively simple components, and relies on the operator interactively adjusting the parameter settings to achieve a visually acceptable agreement between a rectilinear calibration model and his own perception of the scene. Using the tool, we have been able to calibrate images of unknown scenes, taken with unknown cameras, in a matter of minutes. The standard of calibration has proved to be sufficient for model-based pose recovery and tracking of vehicles.

Relevance: 100.00%

Abstract:

Street-level mean flow and turbulence govern the dispersion of gases away from their sources in urban areas. A suitable reference measurement in the driving flow above the urban canopy is needed to both understand and model complex street-level flow for pollutant dispersion or emergency response purposes. In vegetation canopies, a reference at mean canopy height is often used, but it is unclear whether this is suitable for urban canopies. This paper presents an evaluation of the quality of reference measurements at both roof-top (height = H) and at height z = 9H = 190 m, and their ability to explain mean and turbulent variations of street-level flow. Fast response wind data were measured at street canyon and reference sites during the six-week long DAPPLE project field campaign in spring 2004, in central London, UK, and an averaging time of 10 min was used to distinguish recirculation-type mean flow patterns from turbulence. Flow distortion at each reference site was assessed by considering turbulence intensity and streamline deflection. Then each reference was used as the dependent variable in the model of Dobre et al. (2005) which decomposes street-level flow into channelling and recirculating components. The high reference explained more of the variability of the mean flow. Coupling of turbulent kinetic energy was also stronger between street-level and the high reference flow rather than the roof-top. This coupling was weaker when overnight flow was stratified, and turbulence was suppressed at the high reference site. However, such events were rare (<1% of data) over the six-week long period. The potential usefulness of a centralised, high reference site in London was thus demonstrated with application to emergency response and air quality modelling.

Relevance: 100.00%

Abstract:

In this work, compliant actuators are developed by coupling braided structures and polymer gels, able to produce work by controlled gel swelling in the presence of water. A number of aspects related to the engineering of gel actuators were studied, including gel selection, modelling and experimentation of constant force and constant displacement behaviour, and response time. The actuator was intended for use as vibration neutralizer: with this aim, generation of a force of 10 N in a time not exceeding a second was needed. Results were promising in terms of force generation, although response time was still longer than required. In addition, the easiest way to obtain the reversibility of the effect is still under discussion: possible routes for improvement are suggested and will be the object of future work.

Relevance: 100.00%

Abstract:

Despite the success of studies attempting to integrate remotely sensed data and flood modelling, the need to provide near-real-time data routinely on a global scale, and efforts to set up online data archives, there is to date a lack of spatially and temporally distributed hydraulic parameters to support ongoing modelling work. The objective of this project is therefore to provide a global evaluation and benchmark data set of floodplain water stages, with uncertainties, and their assimilation in a large-scale flood model using space-borne radar imagery. An algorithm is developed for the automated retrieval of water stages with uncertainties from a sequence of radar images, and the data are assimilated in a flood model using the Tewkesbury 2007 flood event as a feasibility study. The retrieval method that we employ is based on possibility theory, an extension of fuzzy set theory that encompasses probability theory. In our case we first attempt to identify the main sources of uncertainty in the retrieval of water stages from radar imagery, for which we define physically meaningful ranges of parameter values. Possibilities of values are then computed for each parameter using a triangular 'membership' function. This procedure allows the computation of possible values of water stages at maximum flood extent at many different locations along a river. At a later stage in the project these data are then used in the assimilation, calibration or validation of a flood model. The application is subsequently extended to a global scale using wide-swath radar imagery and a simple global flood forecasting model, thereby providing improved river discharge estimates to update the latter.
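The triangular possibility computation reads, in outline, as follows; the parameter here (a backscatter threshold and its range) is invented for illustration, the triangular form itself is as described above:

```python
def triangular_possibility(x, low, mode, high):
    """Possibility of value x under a triangular membership function over a
    physically meaningful range [low, high] with most-plausible value 'mode'."""
    if x <= low or x >= high:
        return 0.0
    if x <= mode:
        return (x - low) / (mode - low)
    return (high - x) / (high - mode)

# Hypothetical radar backscatter threshold (dB) for classifying open water
poss = [triangular_possibility(x, -13.0, -10.0, -7.0) for x in (-13.0, -10.0, -8.5)]
```

Evaluating every parameter this way, and propagating the possibilities through the retrieval, yields a possibility distribution of water stages at each location rather than a single value.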

Relevance: 100.00%

Abstract:

A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is -1.3% and the 2σ confidence level sets the uncertainty range at -0.7 to -1.9%. The result is the same if one quantifies the solar variation using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in global mean surface air temperatures is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution from a relative lack of major volcanic eruptions. The best estimate is that the anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49–160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961–1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations.
As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature rise commitment) cannot be made from studies of the response to shorter period forcing changes.
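The role of a per-forcing time constant can be illustrated with a first-order (single-exponential) response, the simplest form such a fit can take; the discretization below is our own sketch, not the paper's procedure, and the step forcing is a toy input:

```python
def first_order_response(forcing, tau, dt=1.0):
    """Temperature-like response T obeying dT/dt = (F - T) / tau,
    integrated with explicit Euler steps of size dt (same units as tau)."""
    out, t = [], 0.0
    for f in forcing:
        t += dt * (f - t) / tau
        out.append(t)
    return out

# A step in forcing: the response relaxes toward the new level on time scale tau,
# which is what gives each forcing input its own lag and amplitude in the fit.
step = [1.0] * 100
response = first_order_response(step, tau=5.0)
```

A multivariate fit of the kind described would pass each forcing series through its own response of this form, with its own tau, and regress the weighted sum against the observed temperature record.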

Relevance: 100.00%

Abstract:

Milk solids yield in modern dairy cows has increased linearly over the last 50 years, stressing the need for maximal dietary energy intake to allow genetic potential for milk energy yield to be realized with minimal negative effects on health and reproduction. Feeding supplemental starch is a common approach for increasing the energy density of the ration and supplying carbon for meeting the substantial glucose requirement of the higher yielding cow. In this regard, it is a long-held belief that feeding starch in forms that increase digestion in the small intestine and glucose absorption will benefit the cow in terms of energetic efficiency and production response, but the data supporting this dogma are equivocal. This review will consider the impact of supplemental starch and site of starch digestion on metabolic and production responses of lactating dairy cows, including effects on feed intake, milk yield and composition, nutrient partitioning, the capacity of the small intestine for starch digestion, and nutrient absorption and metabolism by the splanchnic tissues (the portal-drained viscera and liver). Whilst there appears to be considerable capacity for starch digestion and glucose absorption in the lactating dairy cow, numerous strategic studies implementing postruminal starch or glucose infusions have observed increases in milk yield, but decreased milk fat concentration, such that there is little effect on milk energy yield, even in early lactation. Measurements of energy balance confirm that the majority of the supplemental energy arising from postruminal starch digestion is used with high efficiency to support body adipose and protein retention, even in early lactation. These responses may be mediated by changes in insulin status, and be beneficial to the cow in terms of reproductive success and well-being.
However, shifting starch digestion from the rumen impacts the nitrogen economy of the cow as well by shifting the microbial protein gained from starch digestion from potentially absorbable protein to endogenous faecal loss.

Relevance: 100.00%

Abstract:

The purpose of this study was to apply and compare two time-domain analysis procedures in the determination of oxygen uptake (VO2) kinetics in response to a pseudorandom binary sequence (PRBS) exercise test. PRBS exercise tests have typically been analysed in the frequency domain. However, the complex interpretation of frequency responses may have limited the application of this procedure in both sporting and clinical contexts, where a single time measurement would facilitate subject comparison. The relative potential of both a mean response time (MRT) and a peak cross-correlation time (PCCT) was investigated. This study was divided into two parts: a test-retest reliability study (part A), in which 10 healthy male subjects completed two identical PRBS exercise tests, and a comparison of the VO2 kinetics of 12 elite endurance runners (ER) and 12 elite sprinters (SR; part B). In part A, 95% limits of agreement were calculated for comparison between MRT and PCCT. The results of part A showed no significant difference between test and retest as assessed by MRT [mean (SD) 42.2 (4.2) s and 43.8 (6.9) s] or by PCCT [21.8 (3.7) s and 22.7 (4.5) s]. Measurement error (%) was lower for MRT in comparison with PCCT (16% and 25%, respectively). In part B of the study, the VO2 kinetics of ER were significantly faster than those of SR, as assessed by MRT [33.4 (3.4) s and 39.9 (7.1) s, respectively; P<0.01] and PCCT [20.9 (3.8) s and 24.8 (4.5) s; P<0.05]. It is possible that either analysis procedure could provide a single test measurement of VO2 kinetics; however, the greater reliability of the MRT data suggests that this method has more potential for development in the assessment of VO2 kinetics by PRBS exercise testing.
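The peak cross-correlation time can be computed directly from the sampled input (work rate) and response (VO2) signals. The minimal sketch below uses unnormalized correlation and invented toy signals; the study's implementation details (normalization, sampling interval) are not specified here:

```python
def peak_cross_correlation_time(stimulus, response, max_lag):
    """Lag (in samples) at which the stimulus-response cross-correlation
    is largest; multiply by the sampling interval to obtain a time."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(max_lag + 1):
        val = sum(s * response[i + lag]
                  for i, s in enumerate(stimulus[:len(response) - lag]))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# Toy PRBS-like input and a response delayed by 2 samples
stim = [0, 0, 1, 1, 0, 0, 0, 0]
resp = [0, 0, 0, 0, 1, 1, 0, 0]
lag = peak_cross_correlation_time(stim, resp, max_lag=4)
```

The MRT, by contrast, summarizes the whole impulse-response estimate as a single weighted-average time, which is one reason it can show better test-retest reliability than the location of a single correlation peak.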