939 results for monitoring process mean and variance


Relevance: 100.00%

Publisher:

Abstract:

Finnish legislation requires a safe and secure learning environment. However, comprehensive, risk-based safety and security management (SSM) and management commitment to implementing and developing SSM are not mentioned in the legislation. Multiple institutions, operators and researchers have studied and developed safety and security in educational institutions over the past decade, but typically the approach has been fragmented and has not emphasized the importance of comprehensive SSM. The development needs of safety and security operations in universities have been studied. In universities of applied sciences (UASs) and elementary schools (ESs), however, the performance level, strengths and weaknesses of comprehensive SSM have not been studied. The objective of this study was to develop the comprehensive, risk-based SSM of educational institutions by developing the new Asteri consultative auditing process and studying its effects on auditees. Furthermore, the performance level of comprehensive SSM in UASs and ESs was studied using Asteri and the TUTOR model developed by the Keski-Uusimaa Department for Rescue Services, and strengths, development needs and differences were identified. In total, 76 educational institutions were audited between 2011 and 2014. The study is based on logical empiricism, and an observational applied research design was used. Auditing, observation and an electronic survey were used for data collection. Statistical analysis was used to analyze the collected information, and thematic analysis was used to analyze the development areas of the organizations mentioned by respondents in the survey. As one of its main contributions, this research presents the new Asteri consultative auditing process. Organizations with low performance levels on the audited subject benefit the most from the Asteri consultative auditing process, and Asteri may be usable in many types of audits beyond SSM audits. The study also provides new knowledge on attitudes related to auditing. According to the findings, auditing may generate negative attitudes, which the auditor should take into account when planning and preparing for audits. Negative attitudes can be compensated for by bringing added value, objectivity and positivity to the audit, thereby improving the positive effects of auditing on knowledge and skills. Moreover, as the results of this study show, auditing safety and security issues does not increase feelings of insecurity but rather increases feelings of safety and security when the new Asteri consultative auditing process is used with the TUTOR model. The results showed that SSM in the audited UASs was statistically significantly more advanced than in the audited ESs. However, there is still room for improvement in both the ESs and the UASs, as the approach to SSM remained fragmented. It can be assumed that the majority of Finnish UASs and ESs likely do not meet the basic level of comprehensive, risk-based SSM.

Relevance: 100.00%

Publisher:

Abstract:

The objective of this research is to create a current-state analysis of pulp supply chain processes, from production planning to customer deliveries. Cross-functional flowcharts are used to model these processes. The models help identify key performance indicators (KPIs) that enable examination of supply chain efficiency. Measuring the supply chain across its different processes reveals the changes needed in the processes that affect the whole supply chain, its efficiency and its competitiveness. The structure of the pulp supply chain differs from most other supply chains: volumes of bulk products are large, product variation is small, and supply forecasts are made for the year ahead. These characteristics bring benefits but also challenges when developing the supply chain. This thesis divides the pulp supply chain into three main categories: production planning, warehousing and transportation. It provides tools for assessing the functionality of the supply chain and for improving the efficiency of its different functions. With a better understanding of supply chain processes and measurement, the whole supply chain structure can be developed significantly.

Relevance: 100.00%

Publisher:

Abstract:

The ability to monitor and evaluate the consequences of ongoing behaviors and coordinate behavioral adjustments seems to rely on networks including the anterior cingulate cortex (ACC) and phasic changes in dopamine activity. Activity (and presumably functional maturation) of the ACC may be indirectly measured using the error-related negativity (ERN), an event-related potential (ERP) component that is hypothesized to reflect activity of the automatic response monitoring system. To date, no studies have examined the measurement reliability of the ERN as a trait-like measure of response monitoring, its development in mid- and late-adolescence, or its relation to risk-taking and empathic ability, two traits linked to dopaminergic and ACC activity. Utilizing a large sample of 15- and 18-year-old males, the present study examined the test-retest reliability of the ERN, age-related changes in the ERN and other error-monitoring components of the ERP (the Pe and CRN), and the relations of the error-related ERP components to the personality traits of risk propensity and empathy. Results indicated good test-retest reliability of the ERN, providing important validation of the ERN as a stable and possibly trait-like electrophysiological correlate of performance monitoring. Of the three components, only the ERN was of greater amplitude for the older adolescents, suggesting that its ACC network is functionally late to mature, due to either structural or neurochemical changes with age. Finally, the ERN was smaller for those with high risk propensity and low empathy, while the other error-monitoring components were not, which suggests that poor ACC function may be associated with the desire to engage in risky behaviors and that the ERN may be influenced by the extent of individuals' concern with the outcome of events.
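At its simplest, the test-retest reliability reported here is a correlation between each participant's ERN amplitude measured at the two sessions. The abstract does not specify the exact statistic, so the sketch below assumes a Pearson correlation on synthetic amplitudes; the sample size, values and variable names are illustrative only.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)

# Synthetic ERN amplitudes (in microvolts) for the same participants at two sessions:
# a shared trait component plus session-specific noise yields correlated scores.
n_participants = 60
trait = rng.normal(-6.0, 2.0, n_participants)        # hypothetical trait-like "true" ERN
session1 = trait + rng.normal(0.0, 1.0, n_participants)
session2 = trait + rng.normal(0.0, 1.0, n_participants)

r, p = pearsonr(session1, session2)
print(f"test-retest reliability (Pearson r) = {r:.2f}, p = {p:.3g}")
```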

Relevance: 100.00%

Publisher:

Abstract:

Parental monitoring has long been stressed as an important parenting practice in reducing adolescent susceptibility to depression. An extensive review by Stattin and Kerr (2000), however, revealed that researchers had confounded perceptions of parental monitoring (i.e., parental solicitation and control) with parental knowledge, and had neglected to consider the role of adolescent willingness to disclose. In the present study, adolescents (N = 1995; 51.3% female) were surveyed at two time points (grades 10 and 11). To disentangle the role of perceived parenting, three central issues were addressed. First, the study examined whether parental knowledge, adolescent disclosure, and parental monitoring (i.e., parental solicitation and control) in grade 10 predicted adolescent depression in grade 11. Second, the predictive value of adolescent depression in grade 10 on parental knowledge, adolescent disclosure, parental solicitation and parental control in grade 11 was considered. Lastly, associations among parental knowledge, adolescent disclosure, parental solicitation and parental control were examined over time. Findings indicated that higher levels of parental knowledge were associated with subsequent lower levels of depressive symptoms, and that depressive symptoms predicted lower levels of parental knowledge over time. Both adolescent willingness to disclose and parental control predicted higher parental knowledge. These findings underscore the role of adolescent and perceived parent contributions to parental knowledge, and highlight the importance of perceived parental knowledge in predicting reduced adolescent susceptibility to depression.

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we provide both qualitative and quantitative measures of the cost of measuring the integrated volatility by the realized volatility when the frequency of observation is fixed. We start by characterizing, for a general diffusion, the difference between the realized and the integrated volatilities for a given frequency of observations. Then, we compute the mean and variance of this noise and the correlation between the noise and the integrated volatility in the Eigenfunction Stochastic Volatility model of Meddahi (2001a). This model has, as special cases, log-normal, affine, and GARCH diffusion models. Drawing on previous empirical work, we show that the standard deviation of the noise is not negligible with respect to the mean and the standard deviation of the integrated volatility, even when returns are sampled at five-minute intervals. We also propose a simple approach to capture the information about the integrated volatility contained in the returns through the leverage effect.
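The gap between realized and integrated volatility at a fixed sampling frequency can be made concrete with a small simulation. The sketch below is not the paper's Eigenfunction SV calculation; it assumes a square-root (affine-type) variance process with illustrative parameter values and simply compares the daily integrated variance with the realized variance built from 5-minute returns.

```python
import numpy as np

rng = np.random.default_rng(0)

# One trading day simulated on a fine (per-second) grid: 6.5 hours = 23400 steps.
n_fine = 23400
dt = 1.0 / n_fine
kappa, theta, xi = 5.0, 0.02, 0.5   # mean reversion, long-run variance, vol-of-vol (illustrative)

# Square-root variance process, kept non-negative by reflection.
v = np.empty(n_fine + 1)
v[0] = theta
for t in range(n_fine):
    dv = kappa * (theta - v[t]) * dt + xi * np.sqrt(max(v[t], 0.0) * dt) * rng.standard_normal()
    v[t + 1] = abs(v[t] + dv)

# Log-price increments driven by the instantaneous variance.
r_fine = np.sqrt(v[:-1] * dt) * rng.standard_normal(n_fine)

integrated_var = np.sum(v[:-1] * dt)                # "true" integrated variance for the day
returns_5min = r_fine.reshape(78, -1).sum(axis=1)   # 78 five-minute returns
realized_var = np.sum(returns_5min ** 2)            # realized variance at the fixed frequency

noise = realized_var - integrated_var               # the measurement error the paper quantifies
print(f"IV = {integrated_var:.6f}, RV = {realized_var:.6f}, noise = {noise:.6f}")
```

Repeating this over many simulated days would give empirical counterparts of the mean and variance of the noise that the paper characterizes analytically.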

Relevance: 100.00%

Publisher:

Abstract:

This paper develops a model of money demand where the opportunity cost of holding money is subject to regime changes. The regimes are fully characterized by the mean and variance of inflation and are assumed to be the result of alternative government policies. Agents are unable to directly observe whether government actions are indeed consistent with the inflation rate targeted as part of a stabilization program, but they can construct probability inferences on the basis of available observations of inflation and money growth. Government announcements are assumed to provide agents with additional, possibly truthful, information regarding the regime. This specification is estimated and tested using data from the Israeli and Argentine high-inflation periods. Results indicate that the successful stabilization program implemented in Israel in July 1985 was more credible than either the earlier Israeli attempt of November 1984 or the Argentine programs. Government signaling might substantially simplify the inference problem and increase the speed of learning on the part of agents; under certain conditions, however, it might increase the volatility of inflation. After the introduction of an inflation stabilization plan, the welfare gains from a temporary increase in real balances might be high enough to induce agents to raise their real balances in the short term, even if they are uncertain about the nature of government policy and the eventual outcome of the stabilization attempt. Statistically, the model restrictions cannot be rejected at the 1% significance level.
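The learning mechanism described here, with agents forming probability inferences about the policy regime from observed inflation, amounts to Bayesian updating between two regimes characterized by different means and variances of inflation. The sketch below is a stripped-down illustration with made-up regime parameters, using inflation observations only (no money growth or announcements); it is not the estimated specification of the paper.

```python
import numpy as np
from scipy.stats import norm

# Two hypothetical regimes characterized by the mean and std. dev. of monthly inflation.
regimes = {
    "high_inflation": {"mean": 0.15, "sd": 0.05},   # illustrative numbers, not estimates
    "stabilization":  {"mean": 0.02, "sd": 0.01},
}

def update_posterior(prior_stab, observed_inflation):
    """One step of Bayesian learning about the regime from an inflation observation."""
    like_stab = norm.pdf(observed_inflation,
                         regimes["stabilization"]["mean"], regimes["stabilization"]["sd"])
    like_high = norm.pdf(observed_inflation,
                         regimes["high_inflation"]["mean"], regimes["high_inflation"]["sd"])
    num = prior_stab * like_stab
    return num / (num + (1.0 - prior_stab) * like_high)

# Agents start sceptical (30% belief in the stabilization regime) and observe low inflation.
belief = 0.3
for pi in [0.04, 0.03, 0.02, 0.02, 0.01]:
    belief = update_posterior(belief, pi)
    print(f"observed inflation {pi:.2f} -> P(stabilization) = {belief:.3f}")
```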

Relevance: 100.00%

Publisher:

Abstract:

In psycholinguistic research, the assumption is widespread that evaluating information with respect to its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g., Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). A growing number of studies, however, directly or indirectly challenge this two-step model of comprehension and validation. In particular, findings on Stroop-like stimulus-response compatibility effects that arise when positive and negative responses must be given orthogonally to the task-irrelevant truth value of sentences (e.g., a positive response after reading a false sentence or a negative response after reading a true sentence; the epistemic Stroop effect, Richter et al., 2009) suggest that readers perform a non-strategic check of the validity of information already during comprehension. Building on these findings, the aim of this dissertation was to further examine the assumption that comprehension involves a non-strategic, routine, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different emphases were conducted. Study 1 examined whether evidence of epistemic monitoring can also be found for information that is not clearly true or false but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al. (2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, indicating that epistemic monitoring is also sensitive to graded differences in how well information fits with world knowledge. Moreover, the results show that the epistemic Stroop effect is indeed driven by plausibility and not by differences in the predictability of plausible and implausible information. Study 2 tested the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to findings by other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study found a compatibility effect of the task-irrelevant truth value on response latencies in a completely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, although it may depend on depth of processing. Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts. In addition, the potential modulation of these effects by epistemic markers that signal the certainty of information (e.g., certainly or perhaps) was examined. Consistent with the assumption of a fast and non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers emerged on indicators of early comprehension processes.
This suggests that the communicated certainty of information is taken into account by the monitoring process. Overall, the findings argue against a conceptualization of comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears to be, at least to some extent, an obligatory and non-strategic component of language comprehension. Implications of the findings for current models of language comprehension and recommendations for further research on the relationship between comprehension and validation are presented.

Relevance: 100.00%

Publisher:

Abstract:

The preceding two editions of CoDaWork included talks on the possible consideration of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure on the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as reference, and coordinates are obtained with respect to a Hermite-polynomial-based basis. To obtain the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly using discretized scalar products. We therefore propose a weighted linear regression approach, in which all polynomials up to order k are used as predictor variables and the weights are proportional to the reference density. Finally, for the case of 2nd-order Hermite polynomials (normal reference) and 1st-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain size distributions, and the comparison, among different rocks, of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
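The weighted-regression route to the coordinates can be sketched for the unbounded case with a standard normal reference. This is a minimal illustration under assumed conditions (the target density is taken as a known normal rather than an estimated one, and the grid, degree and parameter values are arbitrary); for a normal target, the degree-1 and degree-2 Hermite coordinates relate directly to the classical mean and variance, which the regression should reproduce.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from scipy.stats import norm

# Reference density: standard normal. Target density: N(mu, sigma^2), treated as known
# here purely for illustration (in practice one would plug in an estimated density).
mu, sigma = 0.5, 1.3
x = np.linspace(-6, 6, 2001)                              # discretisation grid
p_ref = norm.pdf(x)                                        # reference density, used as weights
log_ratio = norm.logpdf(x, mu, sigma) - norm.logpdf(x)     # log(f / p_ref)

# Design matrix: probabilists' Hermite polynomials He_0 ... He_K evaluated on the grid.
K = 4
X = He.hermevander(x, K)
w = np.sqrt(p_ref)                                         # weights proportional to the reference density

coef, *_ = np.linalg.lstsq(w[:, None] * X, w * log_ratio, rcond=None)
print("fitted Hermite coordinates He_1..He_K:", np.round(coef[1:], 4))

# For a normal target, the degree-1 and degree-2 coordinates have closed forms tied to
# the classical mean and variance, which the regression should recover:
print("analytic He_1 coefficient:", mu / sigma**2)
print("analytic He_2 coefficient:", 0.5 * (1 - 1 / sigma**2))
```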

Relevance: 100.00%

Publisher:

Abstract:

Previous assessments of the impacts of climate change on heat-related mortality use the "delta method" to create temperature projection time series that are applied to temperature-mortality models to estimate future mortality impacts. With the delta method, climate model bias in the modelled present does not influence the temperature projection time series or the impacts. However, the delta method assumes that climate change will result only in a change in mean temperature, whereas there is evidence that the variability of temperature will also change. The aim of this paper is to demonstrate the importance of considering changes in temperature variability with climate change in impacts assessments of future heat-related mortality. We investigate future heat-related mortality impacts in six cities (Boston, Budapest, Dallas, Lisbon, London and Sydney) by applying temperature projections from the UK Meteorological Office HadCM3 climate model to the temperature-mortality models constructed and validated in Part 1. We investigate the impacts for four cases based on various combinations of mean and variability changes in temperature with climate change. The results demonstrate that higher mortality is attributable to increases in both the mean and variability of temperature with climate change rather than to the change in mean temperature alone. This has implications for interpreting existing impacts estimates that have used the delta method. We present a novel method for the creation of temperature projection time series that includes changes in the mean and variability of temperature with climate change and is not influenced by climate model bias in the modelled present. The method should be useful for future impacts assessments. Few studies consider the implications that the limitations of the climate model may have for the heat-related mortality impacts. Here, we demonstrate the importance of considering this by evaluating the daily and extreme temperatures from HadCM3, which shows that estimates of future heat-related mortality for Dallas and Lisbon may be overestimated owing to positive climate model bias. Likewise, estimates for Boston and London may be underestimated owing to negative climate model bias. Finally, we briefly consider uncertainties in the impacts associated with greenhouse gas emissions and acclimatisation. The uncertainty in the mortality impacts due to different future greenhouse gas emissions scenarios varied considerably by location. Allowing for acclimatisation to an extra 2°C in mean temperature reduced future heat-related mortality by approximately half, relative to no acclimatisation, in each city.
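The contrast between the delta method and a projection that also carries through changes in variability can be illustrated with a toy calculation. The sketch below uses synthetic temperature series and a generic anomaly-rescaling step; it is not the paper's own projection method, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for observed and climate-model daily summer temperatures (°C).
obs           = rng.normal(22.0, 3.0, 92 * 30)   # observed baseline
model_present = rng.normal(20.5, 2.8, 92 * 30)   # model control run (note the cold bias)
model_future  = rng.normal(24.0, 3.6, 92 * 30)   # model future run (warmer and more variable)

# (a) Classical delta method: shift observations by the model's change in mean only.
delta_mean = model_future.mean() - model_present.mean()
proj_delta = obs + delta_mean

# (b) Mean-and-variability method: also rescale the observed anomalies by the ratio of
#     future to present model standard deviations, so changes in variability are carried
#     through while the model bias in the mean still cancels out.
scale = model_future.std() / model_present.std()
proj_mean_var = obs.mean() + delta_mean + (obs - obs.mean()) * scale

threshold = 30.0   # an arbitrary "hot day" threshold for comparing the two series
for name, series in [("delta method", proj_delta), ("mean + variability", proj_mean_var)]:
    print(f"{name:>20}: mean={series.mean():.1f}°C, sd={series.std():.2f}, "
          f"days above {threshold}°C = {np.mean(series > threshold):.3%}")
```

Because extremes respond to the variance as well as the mean, the two series imply different frequencies of hot days even though their means coincide, which is why the impacts differ.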

Relevance: 100.00%

Publisher:

Abstract:

The aim of this paper is to demonstrate the importance of changing temperature variability with climate change in assessments of future heat-related mortality. Previous studies have only considered changes in the mean temperature. Here we present estimates of heat-related mortality resulting from climate change for six cities: Boston, Budapest, Dallas, Lisbon, London and Sydney. They are based on climate change scenarios for the 2080s (2070-2099) and the temperature-mortality (t-m) models constructed and validated in Gosling et al. (2007). We propose a novel methodology for assessing the impacts of climate change on heat-related mortality that considers both changes in the mean and variability of the temperature distribution.
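Once a projected temperature series exists, the mortality impact follows from applying a temperature-mortality relationship to it. The sketch below assumes a simple threshold-linear exposure-response function with made-up parameters; the actual t-m models of Gosling et al. (2007) are city-specific and more detailed.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical linear-above-threshold temperature-mortality relationship:
# no heat effect below the threshold, a fixed proportional increase in deaths per °C above it.
THRESHOLD = 24.0          # °C; city-specific in practice
SLOPE = 0.03              # proportional increase in mortality per °C above the threshold
BASELINE_DEATHS = 100.0   # expected daily deaths from all causes

def heat_attributable_deaths(daily_mean_temp):
    """Daily deaths attributable to heat under the threshold-linear model."""
    excess = np.maximum(daily_mean_temp - THRESHOLD, 0.0)
    return BASELINE_DEATHS * SLOPE * excess

# Apply the same relationship to a baseline summer climate and to a 2080s-style
# projection that is both warmer on average and more variable (30 summers of 92 days).
baseline = rng.normal(21.0, 3.0, 92 * 30)
future   = rng.normal(24.0, 3.8, 92 * 30)

for name, temps in [("baseline", baseline), ("2080s projection", future)]:
    annual = heat_attributable_deaths(temps).sum() / 30.0   # average per summer
    print(f"{name:>17}: ~{annual:.0f} heat-attributable deaths per summer")
```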

Relevance: 100.00%

Publisher:

Abstract:

The importance of temperature in the determination of the yield of an annual crop (groundnut; Arachis hypogaea L. in India) was assessed. Simulations from a regional climate model (PRECIS) were used with a crop model (GLAM) to examine crop growth under simulated current (1961-1990) and future (2071-2100) climates. Two processes were examined: the response of crop duration to mean temperature and the response of seed-set to extremes of temperature. The relative importance of, and interaction between, these two processes was examined for a number of genotypic characteristics, which were represented by using different values of crop model parameters derived from experiments. The impact of mean and extreme temperatures varied geographically, and depended upon the simulated genotypic properties. High temperature stress was not a major determinant of simulated yields in the current climate, but affected the mean and variability of yield under climate change in two regions which had contrasting statistics of daily maximum temperature. Changes in mean temperature had a similar impact on mean yield to that of high temperature stress in some locations and its effects were more widespread. Where the optimal temperature for development was exceeded, the resulting increase in duration in some simulations fully mitigated the negative impacts of extreme temperatures when sufficient water was available for the extended growing period. For some simulations the reduction in mean yield between the current and future climates was as large as 70%, indicating the importance of genotypic adaptation to changes in both means and extremes of temperature under climate change.
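The two processes examined, duration responding to mean temperature and seed-set responding to temperature extremes, can be sketched with simple response functions. The parameter values and functional forms below are illustrative assumptions, not the GLAM crop model's actual calibration.

```python
import numpy as np

# Illustrative parameter values (not GLAM's calibration for groundnut).
T_BASE       = 10.0    # base temperature for development (°C)
T_OPT        = 28.0    # optimum mean temperature for development (°C)
GDD_MATURITY = 1500.0  # thermal time required to reach maturity (°C·days)
T_CRIT       = 34.0    # daily maximum above which seed-set is reduced (°C)
T_ZERO       = 42.0    # daily maximum at which seed-set fails completely (°C)

def days_to_maturity(mean_temp):
    """Duration shortens as mean temperature rises towards the optimum, then lengthens
    again once the optimum is exceeded (a simple tent-shaped development rate)."""
    effective = min(mean_temp, T_OPT) - max(0.0, mean_temp - T_OPT)
    rate = max(effective - T_BASE, 0.1)            # degree-days accumulated per day
    return GDD_MATURITY / rate

def seed_set_fraction(t_max):
    """Fraction of seeds retained for a given daily maximum temperature at flowering:
    1 below T_CRIT, declining linearly to 0 at T_ZERO."""
    return float(np.clip((T_ZERO - t_max) / (T_ZERO - T_CRIT), 0.0, 1.0))

for tm in (24.0, 28.0, 32.0):
    print(f"mean temp {tm}°C -> ~{days_to_maturity(tm):.0f} days to maturity")
for tx in (32.0, 36.0, 40.0):
    print(f"daily max {tx}°C -> seed-set fraction {seed_set_fraction(tx):.2f}")
```

Under this kind of scheme, a supra-optimal mean temperature lengthens the simulated duration, which is the mechanism by which, in the abstract, longer duration can offset extreme-temperature damage when water is not limiting.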