875 results for Value-based selling
Abstract:
Three groups of steer calves totaling 480 head were sorted into smaller and larger frame sizes, and those groups were further sorted into groups with more and less backfat. There was no difference in age among the four sorted groups. The larger steers and the steers with less fat had faster rates of gain and tended to have superior feed efficiencies. Steers with more initial fat were fed fewer days. The larger-framed steers and the steers with less fat had heavier carcasses, less carcass backfat, more yield grade 1 carcasses and a lower percentage of Choice carcasses, but they also had greater value per carcass when evaluated on a grid paying premiums for quality and yield grades. The greatest profit to the feedyard was realized from the smaller-framed steers and from those with less initial backfat. For similar profit, it was calculated that the larger steers should have been discounted as feeders by $3.50 per hundredweight compared with the smaller steers, and the steers with more fat by $5.00 per hundredweight compared with those having less initial fat. The results of this study suggest that sorting based on initial fat thickness may have more potential for enhancing the value of finished cattle than sorting based on frame score.
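The break-even discount quoted above is a simple per-head budget figure spread over the feeder purchase weight. A minimal sketch of that arithmetic follows; the profit gap and purchase weight are hypothetical placeholders, not figures from the study.

```python
# Hypothetical sketch of the break-even feeder-discount arithmetic described
# above: the feeder-price discount ($/cwt, i.e. per 100 lb of purchase weight)
# that would offset a per-head profit disadvantage of one sorted group.
# All input values are placeholders, not data from the study.

def breakeven_discount(profit_gap_per_head, feeder_weight_lb):
    """$/cwt discount on the less profitable group that closes the profit gap."""
    return profit_gap_per_head / (feeder_weight_lb / 100.0)

# Placeholder example: a $22/head profit disadvantage on 625-lb feeder steers.
print(f"{breakeven_discount(22.0, 625.0):.2f} $/cwt")  # -> 3.52 $/cwt
```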
Abstract:
Corn steep liquor is a liquid by-product containing condensed steep water and condensed distillers solubles from a wet corn milling plant. Finishing steers weighing 975 lb were fed corn-based finishing diets containing 0%, 6%, or 12% corn steep liquor for 84 days. Feeding corn steep liquor did not affect steer performance or carcass characteristics. Based on the value of the feeds it replaced in the diet, corn steep liquor was worth $55 to $60/ton (50% dry matter) when used to replace corn and supplemental protein in a corn-based finishing diet.
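The feed-replacement pricing used above can be sketched as simple arithmetic: the byproduct is credited with the market value of the corn and supplemental protein it displaces. The substitution rates and prices below are hypothetical placeholders, not the study's figures.

```python
# Hypothetical sketch of a feed replacement-value calculation: the byproduct is
# credited with the market value of the corn and supplemental protein it
# displaces per ton fed. Substitution rates and prices are placeholders.

def replacement_value_per_ton(corn_lb, corn_price_per_lb,
                              protein_lb, protein_price_per_lb):
    """Value ($/ton of byproduct) of the feeds replaced by one ton of byproduct."""
    return corn_lb * corn_price_per_lb + protein_lb * protein_price_per_lb

# Placeholder: one ton of byproduct replaces 800 lb corn and 150 lb protein supplement.
print(f"${replacement_value_per_ton(800, 0.055, 150, 0.09):.0f}/ton")  # -> $58/ton
```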
Abstract:
For 126 days, 850-lb steers were fed diets of corn, corn silage, and ground hay containing 0%, 4%, or 8% wet distillers solubles obtained from an Iowa dry-mill ethanol plant. Adding distillers solubles resulted in a linear decrease in feed consumption. Gains were increased 3.2% by feeding 4% distillers solubles and decreased 6.4% by feeding 8%. Compared with the control diet, feed required per pound of gain was reduced 5% at the low level of distillers solubles and 1.5% at the high level. Feeding distillers solubles had no effect on carcass measurements. It was concluded that wet distillers solubles have value as a feed for cattle and can replace a portion of the corn grain and supplemental nitrogen in a corn-based finishing diet for beef cattle. The decreased performance of steers fed the 8% level suggests that there may be a maximum amount of wet distillers solubles that can be fed to finishing cattle.
Abstract:
Source verification and pooling of feeder cattle into larger lots resulted in higher selling prices compared with more typical sales at a southern Iowa auction market. After accounting for higher prices due to larger lot sizes, cattle that received a specified management program and were source verified as to origin received additional price premiums. The data do not distinguish between the value of the specific management program and the value of the source verification process. However, cow-calf producers participating in the program took home more money.
Abstract:
AIM To examine the association of alcohol-related mortality and other causes of death with the neighbourhood density of alcohol-selling outlets for on-site consumption. DESIGN, SETTING AND PARTICIPANTS Longitudinal study of the adult Swiss population (n = 4 376 873) based on census records linked to mortality data from 2001 to 2008. MEASUREMENTS Sex-specific hazard ratios (HR) for death and 95% confidence intervals (95% CI) were calculated using Cox models adjusting for age, educational level, occupational attainment, marital status and other potential confounders. The density of alcohol-selling outlets within 1000 m of the residence was calculated using geocodes of outlets and residences. FINDINGS Compared with >17 outlets within 1000 m, the HR for alcohol-related mortality in men was 0.95 (95% CI: 0.89-1.02) for 8-17 outlets, 0.84 (95% CI: 0.77-0.90) for 3-7 outlets, 0.76 (95% CI: 0.68-0.83) for 1-2 outlets and 0.60 (95% CI: 0.51-0.72) for 0 outlets. The gradient in women was somewhat steeper, with an HR comparing 0 with >17 outlets of 0.39 (95% CI: 0.26-0.60). Mortality from mental and behavioural causes and from lung cancer was also associated with the density of alcohol-selling outlets: HRs comparing 0 outlets with >17 outlets were 0.64 (95% CI: 0.52-0.79) and 0.79 (95% CI: 0.72-0.88), respectively, in men and 0.46 (95% CI: 0.27-0.78) and 0.63 (95% CI: 0.52-0.77), respectively, in women. There were weak associations in the same direction with all-cause mortality in men but not in women. CONCLUSIONS In Switzerland, alcohol-related mortality is associated with the density of outlets around the place of residence. Community-level interventions to reduce alcohol outlet density may usefully complement existing interventions.
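The exposure measure described above (outlet density within 1000 m of the residence, computed from geocodes) reduces to a radius count over coordinates. A minimal sketch follows; the planar Swiss-grid coordinates and the Euclidean-distance shortcut are illustrative assumptions, not the study's implementation.

```python
# Hypothetical sketch of the exposure measure described above: counting
# alcohol-selling outlets within 1000 m of each residence from geocoded
# coordinates. Assumes planar (projected) coordinates in metres, e.g. the
# Swiss national grid; the sample points are invented for illustration.
import math

def outlets_within_radius(residence, outlets, radius_m=1000.0):
    """Number of outlets whose Euclidean distance to the residence is <= radius."""
    rx, ry = residence
    return sum(1 for (ox, oy) in outlets
               if math.hypot(ox - rx, oy - ry) <= radius_m)

residence = (2683250.0, 1247900.0)   # placeholder easting/northing (m)
outlets = [(2683300.0, 1248100.0), (2684900.0, 1247500.0), (2683000.0, 1247850.0)]
print(outlets_within_radius(residence, outlets))  # -> 2
```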
Abstract:
This paper contrasts the decision-usefulness of prototype accounting regimes based on perfect accounting for value, i.e. ideal value accounting (IVA), and on perfect matching of cost, i.e. ideal cost accounting (ICA). The regimes are analyzed in the context of a firm with overlapping capacity investments, where projects earn excess returns and residual income is used as the performance indicator. Given that IVA and ICA differ systematically on the criterion of unconditional conservatism, we assess their respective decision-usefulness in different valuation and stewardship scenarios. Assuming that addressees observe only the firm's current accounting data, ICA provides information that is useful for valuation and stewardship without reservation, whereas IVA entails problems under specific assumptions.
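Residual income, the performance indicator used above, is conventionally defined as accounting income minus a capital charge on beginning book value. The sketch below illustrates that textbook definition only, not the paper's specific IVA/ICA model.

```python
# Minimal sketch of the textbook residual-income measure referred to above:
# accounting income minus a capital charge on beginning-of-period book value.
# This illustrates the general definition, not the paper's specific model.

def residual_income(income, opening_book_value, cost_of_capital):
    """RI_t = income_t - r * book_value_{t-1}."""
    return income - cost_of_capital * opening_book_value

# Placeholder numbers: income of 120 on an opening book value of 1000 at r = 10%.
print(residual_income(120.0, 1000.0, 0.10))  # -> 20.0
```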
Abstract:
BACKGROUND: The value of standard two-dimensional transthoracic echocardiographic (TTE) parameters for risk stratification in patients with arrhythmogenic right ventricular cardiomyopathy/dysplasia (ARVC/D) is controversial. METHODS AND RESULTS: We investigated the impact of right ventricular fractional area change (FAC) and tricuspid annulus plane systolic excursion (TAPSE) for prediction of major adverse cardiovascular events (MACE), defined as the occurrence of cardiac death, heart transplantation, survived sudden cardiac death, ventricular fibrillation, sustained ventricular tachycardia or arrhythmogenic syncope. Among 70 patients who fulfilled the 2010 ARVC/D Task Force Criteria and underwent baseline TTE, 37 (53%) patients experienced a MACE during a median follow-up period of 5.3 (IQR 1.8-9.8) years. Average values for FAC, TAPSE, and TAPSE indexed to body surface area (BSA) decreased over time (p=0.03 for FAC, p=0.03 for TAPSE and p=0.01 for TAPSE/BSA, each vs. baseline). In contrast, median right ventricular end-diastolic area (RVEDA) increased (p=0.001 vs. baseline). Based on Kaplan-Meier estimates, the time between baseline TTE and MACE was significantly shorter for patients with FAC <23% (p<0.001), TAPSE <17 mm (p=0.02) or right atrial (RA) short axis/BSA ≥25 mm/m² (p=0.04) at baseline. A reduced FAC constituted the strongest predictor of MACE (hazard ratio 1.08 per 1% decrease; 95% confidence interval 1.04-1.12; p<0.001) on bivariable analysis. CONCLUSIONS: This long-term observational study indicates that TAPSE and dilation of right-sided cardiac chambers are associated with an increased risk of MACE in ARVC/D patients with advanced disease and a high risk of adverse events. However, FAC is the strongest echocardiographic predictor of adverse outcome in these patients. Our data support a role for TTE in risk stratification in patients with ARVC/D, although the results may not be generalizable to lower-risk ARVC/D cohorts.
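The survival comparison described above (MACE-free time stratified by an FAC threshold, assessed with Kaplan-Meier estimates) can be sketched generically as below; the tiny data frame and the choice of the lifelines library are illustrative assumptions, not the study's data or code.

```python
# Hypothetical sketch of the type of analysis described above: comparing
# MACE-free survival between patients with FAC below vs. at/above a 23%
# threshold using Kaplan-Meier estimates and a log-rank test (lifelines).
# The data frame is synthetic; it is not the study's data.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "years_to_event_or_censor": [1.2, 3.5, 5.3, 7.8, 0.9, 2.4, 6.1, 9.5],
    "mace":                     [1,   1,   0,   0,   1,   1,   0,   0],
    "fac_below_23":             [1,   1,   0,   0,   1,   0,   0,   1],
})

kmf = KaplanMeierFitter()
for label, grp in df.groupby("fac_below_23"):
    kmf.fit(grp["years_to_event_or_censor"], grp["mace"],
            label=f"FAC<23%: {bool(label)}")
    print(kmf.median_survival_time_)

low, high = df[df.fac_below_23 == 1], df[df.fac_below_23 == 0]
res = logrank_test(low["years_to_event_or_censor"], high["years_to_event_or_censor"],
                   event_observed_A=low["mace"], event_observed_B=high["mace"])
print(res.p_value)
```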
Abstract:
BACKGROUND The objective of this study was to compare transtelephonic ECG every 2 days and serial 7-day Holter monitoring as two methods of follow-up for judging ablation success after atrial fibrillation (AF) catheter ablation. Patients with highly symptomatic AF are increasingly treated with catheter ablation. Several methods of follow-up have been described, and judgment of ablation success often relies on patients' symptoms. However, the optimal follow-up strategy for objectively detecting most AF recurrences remains unclear. METHODS Thirty patients with highly symptomatic AF were selected for circumferential pulmonary vein ablation. During follow-up, a transtelephonic ECG was transmitted once every 2 days for half a year. Additionally, a 7-day Holter was recorded before ablation, after ablation, and after 3 and 6 months, respectively. With both procedures, symptoms and the actual rhythm were thoroughly correlated. RESULTS A total of 2,600 transtelephonic ECGs were collected, 216 of which showed AF; 25% of those episodes were asymptomatic. On Kaplan-Meier analysis, 45% of the patients with paroxysmal AF were still in continuous SR after 6 months. Simulating a follow-up based on symptomatic recurrences only, that number would have increased to 70%. Using serial 7-day ECG, 113 Holter recordings with over 18,900 hours of ECG data were acquired. After 6 months the percentage of patients classified as free from AF was 50%. Of the patients with recurrences, 30-40% were completely asymptomatic. The percentage of asymptomatic AF episodes increased stepwise from 11% before ablation to 53% at 6 months after ablation. CONCLUSIONS The success rate in terms of freedom from AF was 70% with a symptom-only-based follow-up; it decreased to 50% with serial 7-day Holter and to 45% with transtelephonic monitoring. Transtelephonic ECG and serial 7-day Holter were equally effective for objectively determining long-term success and detecting asymptomatic patients.
Abstract:
Divalent metal ion transporter 1 (DMT1) is a proton-coupled Fe²⁺ transporter that is essential for iron uptake in enterocytes and for transferrin-associated endosomal iron transport in many other cell types. DMT1 dysfunction is associated with several diseases, such as iron overload disorders and neurodegenerative diseases. The main objective of the present work was to develop and validate a fluorescence-based screening assay for DMT1 modulators. We found that Fe²⁺ or Cd²⁺ influx could be reliably monitored in Calcium 5 dye-loaded DMT1-expressing HEK293 cells using the FLIPR Tetra fluorescence microplate reader. DMT1-mediated metal transport shows saturation kinetics with respect to the extracellular substrate concentration, with K0.5 values of 1.4 µM and 3.5 µM for Fe²⁺ and Cd²⁺, respectively. Using Cd²⁺ as the substrate for DMT1, we found a Ki value of 2.1 µM for a compound (2-(3-carbamimidoylsulfanylmethyl-benzyl)-isothiourea) belonging to the benzylisothiourea family, which has been identified as a DMT1 inhibitor. The optimized screening method using this compound as a reference demonstrated a Z' factor of 0.51. In summary, we developed and validated a sensitive and reproducible cell-based fluorescence assay suitable for the identification of compounds that specifically modulate DMT1 transport activity.
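The Z' factor reported above is the standard screening-assay quality statistic computed from positive- and negative-control wells, Z' = 1 - 3(σ_pos + σ_neg)/|μ_pos - μ_neg|. A minimal sketch follows; the control readings are invented for illustration.

```python
# Minimal sketch of the standard Z' factor used above to judge screening-assay
# quality: Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# The control-well readings below are invented for illustration.
from statistics import mean, stdev

def z_prime(positive_controls, negative_controls):
    return 1 - 3 * (stdev(positive_controls) + stdev(negative_controls)) / \
           abs(mean(positive_controls) - mean(negative_controls))

pos = [980, 1010, 995, 1005, 990]   # e.g. uninhibited transport signal
neg = [210, 190, 205, 195, 200]     # e.g. fully inhibited wells
print(round(z_prime(pos, neg), 2))  # -> 0.93 for these placeholder values
```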
Abstract:
In cattle, at least 39 variants of the 4 casein proteins (αS1-, β-, αS2- and κ-casein) have been described to date. Many of these variants are known to affect milk-production traits, cheese-processing properties, and the nutritive value of milk. They also provide valuable information for phylogenetic studies. So far, the majority of studies exploring the genetic variability of the bovine caseins have considered European taurine cattle breeds and were carried out at the protein level by electrophoretic techniques. This only allows the identification of variants that, due to amino acid exchanges, differ in electric charge, molecular weight, or isoelectric point. In this study, the open reading frames of the casein genes CSN1S1, CSN2, CSN1S2, and CSN3 of 356 animals belonging to 14 taurine and 3 indicine cattle breeds were sequenced. With this approach, we identified 23 alleles, including 5 new DNA sequence variants with a predicted effect on the protein sequence. The new variants were found only in indicine breeds and in one local Iranian breed that has been phenotypically classified as taurine. A multidimensional scaling approach based on available SNP chip data, however, revealed an admixture of taurine and indicine populations in this breed as well as in the local Iranian breed Golpayegani. Specific indicine casein alleles were also identified in a few European taurine breeds, indicating introgression of indicine breeds into these populations. This study shows the existence of substantial undiscovered genetic variability at the bovine casein loci, especially in indicine cattle breeds. The identification of new variants is a valuable tool for phylogenetic studies and investigations into the evolution of the milk protein genes.
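The multidimensional scaling step mentioned above projects animals into a low-dimensional space from pairwise genetic distances computed on SNP genotypes. The sketch below illustrates the general idea on a toy genotype matrix; the distance measure and the use of scikit-learn are assumptions, not the study's pipeline.

```python
# Hypothetical sketch of the multidimensional-scaling step mentioned above:
# animals are placed in a 2-D space from pairwise genetic distances computed
# on SNP genotypes (coded 0/1/2 allele counts). The genotype matrix and the
# use of scikit-learn are illustrative assumptions, not the study's pipeline.
import numpy as np
from sklearn.manifold import MDS

genotypes = np.array([        # rows = animals, columns = SNPs (allele counts)
    [0, 1, 2, 0, 1],
    [0, 1, 2, 1, 1],
    [2, 0, 0, 2, 2],
    [2, 1, 0, 2, 2],
])

# Simple distance: mean absolute genotype difference per SNP.
diff = np.abs(genotypes[:, None, :] - genotypes[None, :, :]).mean(axis=2)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(diff)
print(coords)   # two clusters are expected: animals 0-1 vs. animals 2-3
```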
Abstract:
BACKGROUND Prophylactic measures are key components of dairy herd mastitis control programs, but some are relevant only in specific housing systems. To assess the association between management practices and mastitis incidence, data collected in 2011 by a survey among 979 randomly selected Swiss dairy farms, together with information from the regular test-day recordings of 680 of these farms, were analyzed. RESULTS The median incidence of farmer-reported clinical mastitis (ICM) was 11.6 (mean 14.7) cases per 100 cows per year. The median annual proportion of milk samples with a composite somatic cell count (PSCC) above 200,000 cells/ml was 16.1 (mean 17.3)%. A multivariable negative binomial regression model was fitted for each of the mastitis indicators, separately for farms with tie-stall and free-stall housing systems, to study the effect of management practices other than the housing system on ICM and PSCC events (above 200,000 cells/ml). The results differed substantially by housing system and outcome. In tie-stall systems, clinical mastitis incidence was mainly affected by region (mountainous production zone; incidence rate ratio (IRR) = 0.73), the dairy herd replacement system (IRR = 1.27) and the farmer's age (IRR = 0.81). The proportion of high SCC was mainly associated with dry-cow udder controls (IRR = 0.67), clean bedding material at calving (IRR = 1.72), using total merit values to select bulls (IRR = 1.57) and body condition scoring (IRR = 0.74). In free-stall systems, clinical mastitis incidence was mainly associated with stall climate/temperature (IRR = 1.65), comfort mats as resting surface (IRR = 0.75) and the absence of feed analysis (IRR = 1.18). The proportion of high SCC was associated only with hand and arm cleaning after calving (IRR = 0.81) and using beef production value to select bulls (IRR = 0.66). CONCLUSIONS There were substantial differences in the risk factors identified in the four models. Some of the factors agreed with the reported literature while others did not. This highlights the multifactorial nature of the disease and the differences in risks for the two mastitis manifestations. Attempting to understand these multifactorial associations for mastitis within larger management groups continues to play an important role in mastitis control programs.
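The incidence rate ratios above come from negative binomial regression, where exponentiated coefficients are interpreted as IRRs. A generic sketch with invented herd-level data follows; the statsmodels library and the single covariate are illustrative assumptions, not the study's model.

```python
# Hypothetical sketch of the modelling approach above: a negative binomial
# regression of clinical-mastitis case counts on a management practice, with
# cow-years as exposure; exponentiated coefficients are incidence rate ratios
# (IRRs). Herd data are invented; statsmodels is an assumed library choice.
import numpy as np
import statsmodels.api as sm

cases     = np.array([3, 8, 1, 12, 5, 9, 2, 7])          # clinical mastitis cases
cow_years = np.array([40, 55, 30, 60, 45, 50, 35, 52])   # herd size * years at risk
practice  = np.array([1, 0, 1, 0, 1, 0, 1, 0])           # 1 = practice in place

X = sm.add_constant(practice)
model = sm.GLM(cases, X, family=sm.families.NegativeBinomial(),
               exposure=cow_years).fit()
print(np.exp(model.params))   # second entry is the IRR for the practice
```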
Abstract:
Traditional methods do not measure people's risk attitudes naturally and precisely. Therefore, a fuzzy risk attitude classification method is developed. Since prospect theory is usually considered an effective model of decision making, the personalized parameters in prospect theory are first fuzzified to distinguish people with different risk attitudes, and a fuzzy classification database schema is then applied to calculate the exact value of the risk value attitude and the risk behavior attitude. Finally, by applying a two-level hierarchical classification model, the precise value of the synthetic (overall) risk attitude can be acquired.
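The method outlined above fuzzifies the parameters of the prospect-theory value function and then classifies risk attitude from fuzzy memberships. The sketch below shows the standard value function and a toy triangular-membership classification; the parameter values and membership break-points are illustrative, not those of the paper.

```python
# Generic sketch of the prospect-theory ingredients referred to above: the
# value function v(x) with curvature parameters (alpha, beta) and loss
# aversion (lam), plus a toy triangular-membership rule mapping an estimated
# alpha to fuzzy risk-attitude classes. All numbers are illustrative.

def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function: x**alpha for gains, -lam*(-x)**beta for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def triangular(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_value_attitude(alpha):
    """Fuzzy memberships of an individual's alpha in three attitude classes."""
    return {
        "risk averse":  triangular(alpha, 0.0, 0.5, 0.9),
        "risk neutral": triangular(alpha, 0.8, 1.0, 1.2),
        "risk seeking": triangular(alpha, 1.1, 1.5, 2.0),
    }

print(pt_value(100), pt_value(-100))   # losses are weighted more heavily than gains
print(risk_value_attitude(0.7))        # -> highest membership in "risk averse"
```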
Abstract:
The use of biomarkers to infer drug response in patients is being actively pursued, yet significant challenges with this approach, including the complicated interconnection of pathways, have limited its application. Direct empirical testing of tumor sensitivity would arguably provide a more reliable predictive value, although it has garnered little attention largely due to the technical difficulties associated with this approach. We hypothesize that the application of recently developed microtechnologies, coupled to more complex 3-dimensional cell cultures, could provide a model to address some of these issues. As a proof of concept, we developed a microfluidic device where spheroids of the serous epithelial ovarian cancer cell line TOV112D are entrapped and assayed for their chemoresponse to carboplatin and paclitaxel, two therapeutic agents routinely used for the treatment of ovarian cancer. In order to index the chemoresponse, we analyzed the spatiotemporal evolution of the mortality fraction, as judged by vital dyes and confocal microscopy, within spheroids subjected to different drug concentrations and treatment durations inside the microfluidic device. To reflect microenvironment effects, we tested the effect of exogenous extracellular matrix and serum supplementation during spheroid formation on their chemotherapeutic response. Spheroids displayed augmented chemoresistance in comparison to monolayer culturing. This resistance was further increased by the simultaneous presence of both extracellular matrix and high serum concentration during spheroid formation. Following exposure to chemotherapeutics, cell death profiles were not uniform throughout the spheroid. The highest cell death fraction was found at the center of the spheroid and the lowest at the periphery. Collectively, the results demonstrate the validity of the approach, and provide the basis for further investigation of chemotherapeutic responses in ovarian cancer using microfluidics technology. In the future, such microdevices could provide the framework to assay drug sensitivity in a timeframe suitable for clinical decision making.
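The chemoresponse index described above is a mortality fraction tabulated over drug concentration, exposure time and position within the spheroid. The sketch below shows that bookkeeping with invented cell counts; in the study the counts came from vital-dye confocal imaging.

```python
# Generic sketch of the chemoresponse index described above: a mortality
# fraction (dead cells / total cells) tabulated by drug concentration,
# exposure time and radial zone of the spheroid. The counts are invented
# for illustration; the study derived them from confocal images of vital dyes.

def mortality_fraction(dead, live):
    total = dead + live
    return dead / total if total else float("nan")

# (concentration in µM, hours of exposure, zone) -> (dead count, live count)
counts = {
    (10.0, 48, "core"):      (130, 70),
    (10.0, 48, "periphery"): (40, 160),
    (50.0, 48, "core"):      (180, 20),
    (50.0, 48, "periphery"): (90, 110),
}

for key, (dead, live) in counts.items():
    print(key, round(mortality_fraction(dead, live), 2))
```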
Abstract:
Traditionally, desertification research has focused on degradation assessments, whereas prevention and mitigation strategies have not been sufficiently emphasised, although the concept of sustainable land management (SLM) is increasingly being acknowledged. SLM strategies are interventions at the local to regional scale that aim to increase productivity, protect the natural resource base, and improve livelihoods. The global WOCAT initiative and its partners have developed harmonized frameworks to compile, evaluate and analyse the impact of SLM practices around the globe. Recent studies within the EU research project DESIRE developed a methodological framework that combines a collective learning and decision-making approach with the use of best practices from the WOCAT database. In-depth assessment of 30 technologies and 8 approaches from 17 desertification sites enabled an evaluation of how SLM addresses prevalent dryland threats such as water scarcity, soil and vegetation degradation, low production, climate change, resource use conflicts and migration. Among the impacts attributed to the documented technologies, those mentioned most often were diversified and enhanced production and better management of water and soil degradation, whether through water harvesting, improved soil moisture, or reduced runoff. Water harvesting offers under-exploited opportunities for the drylands and the predominantly rainfed farming systems of the developing world. Recently compiled guidelines introduce the concepts behind water harvesting and propose a harmonised classification system, followed by an assessment of the suitability, adoption and up-scaling of practices. Case studies range from large-scale floodwater spreading that makes alluvial plains cultivable, to systems that boost cereal production on small farms, to practices that collect and store water from household compounds. Once contextualized and set in appropriate institutional frameworks, they can form part of an overall adaptation strategy for land users. More field research is needed to reinforce expert assessments of SLM impacts and to provide the necessary evidence-based rationale for investing in SLM. This includes developing methods to quantify and value ecosystem services, both on-site and off-site, and to assess the resilience of SLM practices, as is currently the aim of the new EU CASCADE project.