10 results for Rämä, Iivari: Jääkäripapin pitkä marssi
in CentAUR: Central Archive University of Reading - UK
Abstract:
A study was conducted to estimate variation among laboratories and between manual and automated pressure-measurement techniques in the resulting gas production profiles (GPP). Eight feeds (molassed sugarbeet feed, grass silage, maize silage, soyabean hulls, maize gluten feed, whole crop wheat silage, wheat, glucose) were milled to pass a 1 mm screen and sent to three laboratories (ADAS Nutritional Sciences Research Unit, UK; Institute of Grassland and Environmental Research (IGER), UK; Wageningen University, The Netherlands). Each laboratory measured GPP over 144 h using standardised procedures with manual pressure transducers (MPT) and automated pressure systems (APS). The APS at ADAS used a pressure transducer and bottles in a shaking water bath, while the APS at Wageningen and IGER used a pressure sensor and bottles held in a stationary rack. Apparent dry matter degradability (ADDM) was estimated at the end of the incubation. GPP were fitted to a modified Michaelis-Menten model assuming a single phase of gas production, and were described in terms of the asymptotic volume of gas produced (A), the time to half A (B), the time of maximum gas production rate (t_RM gas) and the maximum gas production rate (R_M gas). There were effects (P<0.001) of substrate on all parameters. MPT produced more gas (P<0.001), but with longer B (P<0.001) and t_RM gas (P<0.05) and lower R_M gas (P<0.001) compared to APS. There was no difference between apparatus in ADDM estimates. Interactions occurred between substrate and apparatus, substrate and laboratory, and laboratory and apparatus. However, when mean MPT values were regressed against those from the individual laboratories, relationships were good (i.e., adjusted R² = 0.827 or higher). Good relationships were also observed with APS, although they were weaker than for MPT (i.e., adjusted R² = 0.723 or higher). The relationships between mean MPT and mean APS data were also good (i.e., adjusted R² = 0.844 or higher). The data suggest that, although laboratory and method of measuring pressure are sources of variation in GPP estimation, it should be possible, using appropriate mathematical models, to standardise data among laboratories so that data from one laboratory can be extrapolated to others. This would allow development of a database of GPP data from many diverse feeds.
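A minimal curve-fitting sketch of the kind of single-phase model described above. The abstract does not give the exact "modified" Michaelis-Menten form, so a generalised Michaelis-Menten curve G(t) = A / (1 + (B/t)^C) is assumed here, with t_RM gas and R_M gas obtained in closed form; the data points are synthetic.

```python
# A minimal sketch, assuming a generalised Michaelis-Menten curve
# G(t) = A / (1 + (B/t)^C); the study's exact "modified" form is not
# given in the abstract. Data points below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def gpp(t, A, B, C):
    """Cumulative gas production (ml) after t hours of incubation."""
    return A / (1.0 + (B / t) ** C)

rng = np.random.default_rng(1)
t = np.array([2, 4, 8, 12, 24, 36, 48, 72, 96, 120, 144], dtype=float)
obs = gpp(t, 220.0, 18.0, 1.8) + rng.normal(0.0, 3.0, t.size)  # 144 h run

(A, B, C), _ = curve_fit(gpp, t, obs, p0=[200.0, 20.0, 1.5])

# Closed forms for this curve: time of maximum rate (needs C > 1) and
# the maximum rate itself, i.e. the abstract's t_RM gas and R_M gas.
t_rm = B * ((C - 1.0) / (C + 1.0)) ** (1.0 / C)
r_m = A * C * B**C * t_rm**(C - 1.0) / (B**C + t_rm**C) ** 2
print(f"A = {A:.1f} ml, B = {B:.1f} h, t_RM = {t_rm:.1f} h, R_M = {r_m:.2f} ml/h")
```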
Abstract:
Risk management (RM) comprises risk identification, risk analysis, response planning, monitoring and action planning tasks that are carried out throughout the life cycle of a project in order to ensure that project objectives are met. Although the methodological aspects of RM are well-defined, the philosophical background is rather vague. In this paper, a learning-based approach is proposed. In order to implement this approach in practice, a tool has been developed to facilitate construction of a lessons learned database that contains risk-related information and risk assessments made throughout the life cycle of a project. The tool was tested on a real construction project. The case study findings demonstrate that it can be used for storing and updating risk-related information and, finally, for carrying out a post-project appraisal. The major weaknesses of the tool are the subjectivity of the risk rating process and the unwillingness of people to enter information about the reasons for failure.
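A minimal sketch of what one record in such a lessons learned database might look like; every field name here is an illustrative assumption, not the tool's actual schema.

```python
# A minimal sketch of one record in a lessons learned risk database of
# the kind described above; all field names are illustrative
# assumptions, not the tool's actual schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskRecord:
    project_id: str
    description: str        # the identified risk
    rating: int             # subjective 1-5 rating (a noted weakness)
    response: str           # planned mitigation / response action
    occurred: bool = False  # updated over the project life cycle
    outcome_notes: str = "" # filled in during post-project appraisal
    logged_on: date = field(default_factory=date.today)
```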
Abstract:
Requirements management (RM), as practised in the aerospace and defence sectors, attracts interest from construction researchers in response to longstanding problems of project definition. Doubts are expressed whether RM offers a new discipline for construction practitioners or whether it repeats previous exhortations to adopt a more disciplined way of working. Whilst systems engineering has an established track record of addressing complex technical problems, its extension to socially complex problems has been challenged. The dominant storyline of RM is one of procedural rationality and RM is commonly presented as a means of controlling dilettante behaviour. Interviews with RM practitioners suggest a considerable gulf between the dominant storyline in the literature and how practitioners operate in practice. The paper challenges construction researchers interested in RM to reflect more upon the theoretical debates that underpin current equivalent practices in construction and the disparity between espoused and enacted practice.
Abstract:
The mean state, variability and extreme variability of the stratospheric polar vortices, with an emphasis on the Northern Hemisphere vortex, are examined using 2-dimensional moment analysis and Extreme Value Theory (EVT). The use of moments as an analysis tool gives rise to information about the vortex area, centroid latitude, aspect ratio and kurtosis. The application of EVT to these moment-derived quantities allows the extreme variability of the vortex to be assessed. The data used for this study are ECMWF ERA-40 potential vorticity fields on interpolated isentropic surfaces that range from 450 K to 1450 K. Analyses show that the most extreme vortex variability occurs most commonly in late January and early February, consistent with the period when most planetary wave driving from the troposphere is observed. Composites around sudden stratospheric warming (SSW) events reveal that the moment diagnostics evolve in statistically different ways between vortex splitting events and vortex displacement events, in contrast to the traditional diagnostics. Histograms of the vortex diagnostics on the 850 K (∼10 hPa) surface over the 1958-2001 period are fitted with parametric distributions, and show that SSW events comprise the majority of data in the tails of the distributions. The distribution of each diagnostic is computed on various surfaces throughout the depth of the stratosphere, and shows that in general the vortex becomes more circular, with higher filamentation, at the upper levels. The Northern Hemisphere (NH) and Southern Hemisphere (SH) vortices are also compared through the analysis of their respective vortex diagnostics, confirming that the SH vortex is less variable and lacks extreme events compared to the NH vortex. Finally, EVT is used to statistically model the vortex diagnostics and make inferences about the underlying dynamics of the polar vortices.
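A minimal sketch of the 2-D moment diagnostics named above (area, centroid, aspect ratio) for a vortex defined as the region where PV exceeds a threshold. The Cartesian grid, the anomaly weighting and the threshold value are illustrative assumptions; the study works with ERA-40 PV on isentropic surfaces.

```python
# A minimal sketch of 2-D moment diagnostics for a vortex defined as the
# region where PV exceeds a threshold; grid and threshold are assumptions.
import numpy as np

def vortex_moments(pv, x, y, threshold):
    """Area, centroid and aspect ratio of the super-threshold region."""
    X, Y = np.meshgrid(x, y)
    q = np.where(pv > threshold, pv - threshold, 0.0)  # PV anomaly weight
    m00 = q.sum()
    xc, yc = (q * X).sum() / m00, (q * Y).sum() / m00  # centroid
    j20 = (q * (X - xc) ** 2).sum() / m00              # central second
    j02 = (q * (Y - yc) ** 2).sum() / m00              # moments give the
    j11 = (q * (X - xc) * (Y - yc)).sum() / m00        # vortex covariance
    lo, hi = np.linalg.eigvalsh([[j20, j11], [j11, j02]])
    aspect = np.sqrt(hi / lo)                          # major/minor ratio
    area = (pv > threshold).sum() * (x[1] - x[0]) * (y[1] - y[0])
    return area, (xc, yc), aspect

# A 2:1 elliptical blob should return an aspect ratio close to 2
x = y = np.linspace(-1.0, 1.0, 201)
X, Y = np.meshgrid(x, y)
pv = np.exp(-((X / 0.6) ** 2 + (Y / 0.3) ** 2))
print(vortex_moments(pv, x, y, threshold=0.5))
```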
Abstract:
We propose and analyse a class of evolving network models suitable for describing a dynamic topological structure. Applications include telecommunication, on-line social behaviour and information processing in neuroscience. We model the evolving network as a discrete time Markov chain, and study a very general framework where, conditioned on the current state, edges appear or disappear independently at the next timestep. We show how to exploit symmetries in the microscopic, localized rules in order to obtain conjugate classes of random graphs that simplify analysis and calibration of a model. Further, we develop a mean field theory for describing network evolution. For a simple but realistic scenario incorporating the triadic closure effect that has been empirically observed by social scientists (friends of friends tend to become friends), the mean field theory predicts bistable dynamics, and computational results confirm this prediction. We also discuss the calibration issue for a set of real cell phone data, and find support for a stratified model, where individuals are assigned to one of two distinct groups having different within-group and across-group dynamics.
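A minimal simulation sketch of the class of model described above, not the paper's calibrated model: a discrete-time Markov chain on edges in which, conditioned on the current adjacency matrix, each absent edge appears with a probability that grows with the number of mutual neighbours (triadic closure) and each present edge disappears independently. The rate constants are illustrative assumptions.

```python
# A minimal sketch: discrete-time Markov chain on edges where, given the
# current adjacency matrix A, an absent edge (i, j) appears with
# probability delta + eps * (mutual neighbours of i and j) -- triadic
# closure -- and a present edge disappears with probability omega.
import numpy as np

rng = np.random.default_rng(0)
n, steps = 100, 500
delta, eps, omega = 0.001, 0.005, 0.05  # illustrative rate constants

A = np.triu((rng.random((n, n)) < 0.05).astype(int), 1)
A = A + A.T                              # symmetric start, no self-loops

for _ in range(steps):
    common = A @ A                       # common[i, j] = mutual neighbours
    p_birth = np.clip(delta + eps * common, 0.0, 1.0)
    u = np.triu(rng.random((n, n)), 1)
    u = u + u.T                          # one symmetric draw per edge
    born = (A == 0) & (u < p_birth)
    died = (A == 1) & (u < omega)
    A[born], A[died] = 1, 0
    np.fill_diagonal(A, 0)

# Depending on the draw and the constants, the density settles near a
# sparse or a dense quasi-stable state; sweeping eps is one way to see
# the bistability predicted by the mean field theory.
print("edge density:", A.sum() / (n * (n - 1)))
```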
Abstract:
Haem in red meat (RM) stimulates the endogenous production of mutagenic nitroso compounds (NOC). Processed (nitrite-preserved red) meat additionally contains high concentrations of preformed NOC. In two studies, of a fresh RM versus a vegetarian (VEG) diet (six males and six females) and of a nitrite-preserved red meat (PM) versus a VEG diet (5 males and 11 females), we investigated whether processing of meat might increase colorectal cancer risk by stimulating nitrosation and DNA damage. Meat diets contained 420 g (males) or 366 g (females) of meat per day. Faecal homogenates from day 10 onwards were analysed for haem and NOC, and the associated supernatants for genotoxicity. Means are adjusted for differences in male to female ratios between studies. Faecal NOC concentrations on VEG diets were low (2.6 and 3.5 nmol/g) but significantly higher on meat diets (PM 175 ± 19 nmol/g versus RM 185 ± 22 nmol/g; P = 0.75). The RM diet resulted in a larger proportion of nitrosyl iron (RM 78% versus PM 54%; P < 0.0001) and smaller proportions of nitrosothiols (RM 12% versus PM 19%; P < 0.01) and other NOC (RM 10% versus PM 27%; P < 0.0001). There was no statistically significant difference in DNA breaks induced by faecal water (FW) following the PM and RM diets (P = 0.80). However, PM resulted in higher levels of oxidized pyrimidines (P < 0.05). Surprisingly, VEG diets resulted in significantly more FW-induced DNA strand breaks than the meat diets (P < 0.05), which needs to be clarified in further studies. Meats cured with nitrite have the same effect as fresh RM on endogenous nitrosation but show increased FW-induced oxidative DNA damage.
Abstract:
The degree to which palaeoclimatic changes in the Southern Hemisphere co-varied with events in the high latitude Northern Hemisphere during the Last Termination is a contentious issue, with conflicting evidence for the degree of 'teleconnection' between different regions of the Southern Hemisphere. The available hypotheses are difficult to test robustly, however, because there are few detailed palaeoclimatic records in the Southern Hemisphere. Here we present climatic reconstructions from the southwestern Pacific, a key region in the Southern Hemisphere because of the potentially important role it plays in global climate change. The reconstructions for the period 20–10 kyr BP were obtained from five sites along a transect from southern New Zealand, through Australia to Indonesia, supported by 125 calibrated ¹⁴C ages. Two periods of significant climatic change can be identified across the region at around 17 and 14.2 cal kyr BP, most probably associated with the onset of warming in the West Pacific Warm Pool and the collapse of Antarctic ice during Meltwater Pulse-1A, respectively. The severe geochronological constraints that inherently afflict age models based on radiocarbon dating and the lack of quantified climatic parameters make more detailed interpretations problematic, however. There is an urgent need to address the geochronological limitations, and to develop more precise and quantified estimates of the pronounced climate variations that clearly affected this region during the Last Termination.
Abstract:
A plasma source, sustained by the application of a floating high voltage (±15 kV) to parallel-plate electrodes at 50 Hz, has been achieved in a helium/air mixture at atmospheric pressure (P = 10⁵ Pa) contained in a zip-locked plastic package placed in the electrode gap. Some of the physical and antimicrobial properties of this apparatus were established with a view to ascertaining its performance as a prototype for the disinfection of fresh produce. The current–voltage (I–V) and charge–voltage (Q–V) characteristics of the system were measured as a function of gap distance d, in the range 3 × 10³ ≤ Pd ≤ 1.0 × 10⁴ Pa·m. The electrical measurements showed this plasma source to exhibit the characteristic behaviour of a dielectric barrier discharge in the filamentary mode, and its properties could be accurately interpreted by the two-capacitances-in-series model. The power consumed by the discharge and the reduced field strength were found to decrease quadratically from 12.0 W to 4.5 W and linearly from 140 Td to 50 Td, respectively, over the range studied. Emission spectra of the discharge were recorded on a relative intensity scale, and the dominant spectral features could be assigned to strong vibrational bands in the 2+ and 1− systems of N₂ and N₂⁺, respectively, with other weak signatures from the NO and OH radicals and the N⁺, He and O atomic species. Absolute spectral intensities were also recorded and interpreted by comparison with the non-equilibrium synthetic spectra generated by the computer code SPECAIR. At an inter-electrode gap of 0.04 m, this comparison yielded typical values for the electron, vibrational and translational (gas) temperatures of (4980 ± 100) K, (2700 ± 200) K and (300 ± 100) K, respectively, and an electron density of 1.0 × 10¹⁷ m⁻³. A Boltzmann plot also provided a value of (3200 ± 200) K for the vibrational temperature. The antimicrobial efficacy was assessed by studying the resistance of Escherichia coli K12 and its isogenic mutants in soxR, soxS, oxyR, rpoS and dnaK, selected to identify possible cellular responses and targets related to a 5 min exposure to the active gas in proximity to, but not directly in, the path of the discharge filaments. Both the parent strain and the mutant populations were significantly reduced by more than 1.5 log cycles under these conditions, showing the potential of the system. Post-treatment storage studies showed that some transcription regulators and specific genes related to oxidative stress play an important role in the E. coli repair mechanism and that plasma exposure affects specific cell regulator systems.
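A minimal sketch of the standard Q–V (Lissajous) power diagnostic implied by the charge–voltage measurements above: for a dielectric barrier discharge, the energy dissipated per cycle is the area of the closed Q–V loop, ∮V dQ, and the power is that area times the 50 Hz driving frequency. The waveforms below are synthetic placeholders, not the measured data.

```python
# A minimal sketch of the Q-V (Lissajous) power diagnostic for a
# dielectric barrier discharge: energy per cycle is the area of the
# closed Q-V loop, power is that area times the driving frequency.
# The waveforms below are synthetic placeholders, not measured data.
import numpy as np

def dbd_power(v, q, freq_hz=50.0):
    """Power (W) from one sampled Q-V cycle via the shoelace loop area."""
    area = 0.5 * abs(np.sum(q * np.roll(v, -1) - v * np.roll(q, -1)))
    return freq_hz * area

theta = np.linspace(0.0, 2.0 * np.pi, 2000, endpoint=False)
v = 15e3 * np.sin(theta)                # +/- 15 kV applied voltage
q = 4e-6 * np.sin(theta - 0.5)          # phase-shifted transferred charge
print(f"P = {dbd_power(v, q):.1f} W")   # ~4.5 W for these placeholders
```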
Abstract:
In probabilistic decision tasks, an expected value (EV) of a choice is calculated and, after the choice has been made, can be updated based on a temporal difference (TD) prediction error between the EV and the reward magnitude (RM) obtained. The EV is computed as the probability of obtaining a reward × RM. To understand the contribution of different brain areas to these decision-making processes, functional magnetic resonance imaging activations related to EV versus RM (or outcome) were measured in a probabilistic decision task. Activations in the medial orbitofrontal cortex were correlated with both RM and EV, and a conjunction analysis confirmed that they extend toward the pregenual cingulate cortex. From these representations, TD reward prediction errors could be produced. Activations in areas that receive from the orbitofrontal cortex, including the ventral striatum, midbrain and inferior frontal gyrus, were correlated with the TD error. Activations in the anterior insula were correlated negatively with EV, occurring when low reward outcomes were expected, and also with the uncertainty of the reward, implicating this region in representing basic and crucial decision-making parameters: low expected outcomes and uncertainty.
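A minimal sketch of the two quantities the abstract relates: the expected value EV = p(reward) × RM, and the TD prediction error computed at outcome as the obtained RM minus the EV. The trial numbers are made up.

```python
# A minimal sketch: EV = p(reward) * RM, and the temporal difference
# prediction error delta = RM_obtained - EV at outcome. Made-up trials.
def expected_value(p_reward: float, magnitude: float) -> float:
    return p_reward * magnitude

def td_error(outcome: float, ev: float) -> float:
    """Positive when the outcome beats expectation, negative otherwise."""
    return outcome - ev

ev = expected_value(p_reward=0.3, magnitude=10.0)  # EV = 3.0
print(td_error(outcome=10.0, ev=ev))               # rewarded trial: +7.0
print(td_error(outcome=0.0, ev=ev))                # omitted reward: -3.0
```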
Abstract:
Past research has documented a substitution effect between real earnings management (RM) and accrual-based earnings management (AM), depending on their relative costs. This study contributes to this research by examining whether levels of (and changes in) financial leverage have an impact on this empirically documented trade-off. We hypothesise that in the presence of high leverage, firms that engage in earnings manipulation tactics will exhibit a preference for RM due to a lower likelihood, and lower subsequent costs, of getting caught. We show that both leverage levels and leverage increases positively and significantly affect upward RM, with no significant effect on income-increasing AM, while our findings point towards a complementarity effect between unexpected levels of RM and AM for firms with very high leverage levels and changes. This is interpreted as an indication that high leverage can attract heavy outsider scrutiny, making it necessary for firms to use both forms of earnings management in order to achieve earnings targets. Furthermore, we document that equity investors exhibit a significantly stronger penalising reaction to AM than to RM, indicating that leverage-induced RM is not as easily detectable by market participants as debt-induced AM, despite the fact that the former may imply deviation from optimal business practices.