893 results for Statistical process control


Relevance:

30.00%

Publisher:

Abstract:

In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. Predictions of scalar transport and mixing are often inferred and rarely accurate, owing to an inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions with near-zero freshwater discharge for about 48 hours. During the study, acoustic Doppler velocimeters (ADVs) were sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and global positioning system (GPS)-tracked drifters were used to obtain lower-frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented with continuous measurements of water depth, conductivity, temperature and other physicochemical parameters. Thorough quality control was carried out by applying relevant error-removal filters to the individual data sets to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance and ‘true’ turbulence in the flow field. The time series of mean flow measurements from both the ADCP and the drifters were consistent with the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to the tidal inertial current. The channel exhibited a mixed-type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly at slack water. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy.
For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9, and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The statistical analysis suggested that the ebb-phase turbulence field was dominated by eddies that evolved from ejection-type processes, while that of the flood phase contained mixed eddies, with a significant proportion related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell within the range of −0.5 to +2. The TD technique described herein allowed the characterisation of a broader temporal scale of fluctuations in high-frequency data sampled over a few tidal cycles. The study provides a characterisation of the ranges of fluctuation required for accurate modelling of shallow-water dispersion and mixing in a sub-tropical estuary.
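A triple decomposition of this kind separates each velocity record into a slowly varying tidal component, a slow (resonance-scale) fluctuation, and a turbulent residual. As a rough illustration of the idea only (the moving-average filters and cutoff periods below are assumptions, not the study's actual method):

```python
import numpy as np

def triple_decompose(u, fs, tidal_cutoff_s, slow_cutoff_s):
    """Split a velocity record u (sampled at fs Hz) into a tidal-scale
    component, a slow fluctuation, and a turbulent residual using
    moving averages of two different window lengths (illustrative only)."""
    def moving_average(x, window_s):
        w = max(1, int(window_s * fs))
        return np.convolve(x, np.ones(w) / w, mode="same")

    tidal = moving_average(u, tidal_cutoff_s)        # large-scale / tidal motion
    slow = moving_average(u, slow_cutoff_s) - tidal  # slow 'resonance' fluctuations
    turbulence = u - tidal - slow                    # residual 'true' turbulence
    return tidal, slow, turbulence
```

By construction the three components sum back to the raw signal, so no variance is lost in the split; the choice of cutoffs is what assigns variance to each scale.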


Big Tobacco has been engaged in a dark, shadowy plot and conspiracy to hijack the Trans-Pacific Partnership Agreement (TPP) and undermine tobacco control measures – such as graphic health warnings and the plain packaging of tobacco products... In the context of this heavy lobbying by Big Tobacco and its proxies, this chapter provides an analysis of the debate over trade, tobacco, and the TPP. This discussion is necessarily focused on the negotiations of the free trade agreement – the shadowy conflicts before the finalisation of the text. This chapter contends that the trade negotiations threaten hard-won gains in public health – including international developments, such as the WHO Framework Convention on Tobacco Control, and domestic measures, such as graphic health warnings and the plain packaging of tobacco products. It maintains that regional trade agreements need to respect the primacy of the WHO Framework Convention on Tobacco Control. There is a need both to provide an open and transparent process for such trade negotiations and to ensure due and proper respect for public health in terms of substantive obligations. Part I focuses on the debate over the intellectual property chapter of the TPP, within the broader context of domestic litigation against Australia’s plain tobacco packaging regime and associated WTO disputes. Part II examines the investment chapter of the TPP, taking account of ongoing investment disputes concerning tobacco control and the declared approaches of Australia and New Zealand to investor-state dispute settlement. Part III looks at the discussion as to whether there should be specific text on tobacco control in the TPP, and, if so, what its nature and content should be. This chapter concludes that the plain packaging of tobacco products – and other best practices in tobacco control – should be adopted by members of the Pacific Rim.


We defined a new statistical fluid registration method with Lagrangian mechanics. Although several authors have suggested that empirical statistics on brain variation should be incorporated into the registration problem, few algorithms have included this information; most instead use regularizers that guarantee diffeomorphic mappings. Here we combine the advantages of a large-deformation fluid matching approach with empirical statistics on population variability in anatomy. We reformulated the Riemannian fluid algorithm developed in [4], and used a Lagrangian framework to incorporate 0th- and 1st-order statistics in the regularization process. 92 2D midline corpus callosum traces from a twin MRI database were fluidly registered using the non-statistical version of the algorithm (algorithm 0), giving initial vector fields and deformation tensors. Covariance matrices were computed for both distributions and incorporated either separately (algorithms 1 and 2) or together (algorithm 3) in the registration. We computed heritability maps and two vector- and tensor-based distances to compare the power and robustness of the algorithms.
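The abstract does not spell out the registration energy. One plausible reading of incorporating 0th- and 1st-order statistics (a sketch under assumptions, not the paper's actual functional) is a fluid-matching energy augmented with Mahalanobis-type priors on the displacement field and on its gradient, using the empirical means and covariances estimated from algorithm 0:

```latex
E(\mathbf{u}) \;=\; E_{\mathrm{fluid}}(\mathbf{u})
\;+\; \lambda_0 \int_\Omega
      \bigl(\mathbf{u}(x)-\bar{\mathbf{u}}(x)\bigr)^{\top}
      \Sigma_{\mathbf{u}}^{-1}(x)\,
      \bigl(\mathbf{u}(x)-\bar{\mathbf{u}}(x)\bigr)\,dx
\;+\; \lambda_1 \int_\Omega
      \bigl(\operatorname{vec}\nabla\mathbf{u}(x)-\overline{\operatorname{vec}\nabla\mathbf{u}}(x)\bigr)^{\top}
      \Sigma_{\nabla\mathbf{u}}^{-1}(x)\,
      \bigl(\operatorname{vec}\nabla\mathbf{u}(x)-\overline{\operatorname{vec}\nabla\mathbf{u}}(x)\bigr)\,dx
```

Here the $\lambda_0$ term penalises deviation of the vector field from its population mean (0th order), and the $\lambda_1$ term does the same for the vectorised deformation gradient (1st order); using only one term corresponds to algorithms 1 and 2, and both together to algorithm 3.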


This study developed and tested a model of job uncertainty for survivors and victims of downsizing. Data were collected from three samples of employees in a public hospital, each representing one of three phases of the downsizing process: immediately before the announcement of the redeployment of staff, during the implementation of the downsizing, and towards the end of the official change programme. As predicted, levels of job uncertainty and personal control had a direct relationship with emotional exhaustion and job satisfaction. In addition, there was evidence to suggest that personal control mediated the relationship between job uncertainty and employee adjustment, a pattern of results that varied across each of the three phases of the change event. From the perspective of the organization’s overall climate, it was found that levels of job uncertainty, personal control and job satisfaction improved and/or stabilized over the downsizing process. During the implementation phase, survivors experienced higher levels of personal control than victims, but both groups of employees reported similar levels of job uncertainty. We discuss the implications of our results for strategically managing uncertainty during and after organizational change.
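A mediation claim of this kind (job uncertainty → personal control → adjustment) is usually tested by decomposing the total regression effect into direct and indirect parts. A minimal sketch on synthetic data (the variable names and effect sizes are invented for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical measures: job uncertainty (x), personal control (m),
# and adjustment, e.g. job satisfaction (y); effect sizes are invented.
x = rng.normal(size=n)
m = -0.6 * x + rng.normal(scale=0.8, size=n)     # uncertainty erodes control
y = 0.5 * m - 0.1 * x + rng.normal(scale=0.8, size=n)

def ols(y, *cols):
    """Least-squares coefficients for y ~ 1 + cols."""
    X = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(X, y, rcond=None)[0]

total = ols(y, x)[1]             # total effect c of uncertainty on adjustment
a = ols(m, x)[1]                 # path a: uncertainty -> control
_, direct, b = ols(y, x, m)      # direct effect c' and path b: control -> adjustment
indirect = a * b                 # mediated effect; for OLS, c = c' + a*b exactly
```

For linear least squares on the same sample, the identity total = direct + indirect holds exactly, which is what makes the decomposition a clean test of partial versus full mediation.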


This paper applies concepts that Deleuze developed in his ‘Postscript on the Societies of Control’, especially those relating to modulatory power, dividuation and control, to aspects of Australian schooling, in order to explore how the transition from discipline to modulation is manifesting itself. Two modulatory machines of assessment, NAPLAN and My Schools, are examined as a means to better understand how the disciplinary institution is changing as a result of modulation. This transition is visible in the declining importance of the disciplinary teacher–student relationship as a measure of the success of the educative process. The transition occurs through seduction, because that which purports to measure classroom quality is in fact a serpent of modulation that produces simulacra of the disciplinary classroom. The effect is to sever what happens in the disciplinary space from its representations in a luminiferous ether that overlays the classroom.


While anecdotal evidence indicates that financial advice affects consumers’ financial well-being, this research project is motivated by the absence of empirically grounded research on the extent to which, and, importantly, how, financial planning advice contributes to broader client well-being. Accordingly, the aim of this project is to establish how the quality of financial planning advice can be optimised to add value, not only to clients’ financial situation, but also to broader aspects of their well-being. This broader construct of well-being captures a range of process and outcome factors that map to concepts of security, control, choice, mastery, and life satisfaction (Irving, 2012; Gallery, Gallery, Irving & Newton, 2011; Irving, Gallery & Gallery, 2009). Financial planning is commonly purported to confer not only tangible benefits, but also intangible benefits, such as increased security and peace of mind, that are considered as important as, if not more important than, material outcomes. Such claims are intuitively appealing; however, little empirical evidence exists for the notion that engaging with a financial planner or adviser promotes peace of mind and feelings of security, and expands choices and possibilities. Nor is there evidence signalling what mechanisms might underpin such client benefits. In addressing this issue, we examine the financial planning advice (including financial product advice) provided to retail clients, and consider the short- and longer-term impacts on clients’ financial satisfaction and broader well-being. To this end, we examine both process (e.g., how financial planning advice is given) and outcome (e.g., financial situation) effects.


As statistical education becomes more firmly embedded in the school curriculum and its value across the curriculum is recognised, attention moves from knowing procedures, such as calculating a mean or drawing a graph, to understanding the purpose of a statistical investigation in decision making in many disciplines. As students learn to complete the stages of an investigation, the question of meaningful assessment of the process arises. This paper considers models for carrying out a statistical inquiry and, based on a four-phase model, creates a developmental sequence that can be used for the assessment of outcomes from each of the four phases as well as for the complete inquiry. The developmental sequence is based on the SOLO model, focussing on the "observed" outcomes during the inquiry process.


Molecular phylogenetic studies of homologous sequences of nucleotides often assume that the underlying evolutionary process was globally stationary, reversible, and homogeneous (SRH), and that a model of evolution with one or more site-specific and time-reversible rate matrices (e.g., the GTR rate matrix) is enough to accurately model the evolution of data over the whole tree. However, an increasing body of data suggests that evolution under these conditions is the exception rather than the norm. To address this issue, several non-SRH models of molecular evolution have been proposed, but they either ignore heterogeneity in the substitution process across sites (HAS) or assume that it can be modeled accurately using a predefined statistical distribution. As an alternative to these models of evolution, we introduce a family of mixture models that approximate HAS without the assumption of an underlying predefined statistical distribution. This family of mixture models is combined with non-SRH models of evolution that account for heterogeneity in the substitution process across lineages (HAL). We also present two algorithms for searching model space and identifying an optimal model of evolution that is less likely to over- or underparameterize the data. The performance of the two new algorithms was evaluated using alignments of nucleotides with 10 000 sites simulated under complex non-SRH conditions on a 25-tipped tree. The algorithms were found to be very successful, identifying the correct HAL model with a 75% success rate (the average success rate for assigning rate matrices to the tree's 48 edges was 99.25%) and, for the correct HAL model, identifying the correct HAS model with a 98% success rate. Finally, parameter estimates obtained under the correct HAL-HAS model were found to be accurate and precise.
The merits of our new algorithms were illustrated with an analysis of 42 337 second codon sites extracted from a concatenation of 106 alignments of orthologous genes encoded by the nuclear genomes of Saccharomyces cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, S. castellii, S. kluyveri, S. bayanus, and Candida albicans. Our results show that second codon sites in the ancestral genome of these species contained 49.1% invariable sites, 39.6% variable sites belonging to one rate category (V1), and 11.3% variable sites belonging to a second rate category (V2). The ancestral nucleotide content was found to differ markedly across these three sets of sites, and the evolutionary processes operating at the variable sites were found to be non-SRH and best modeled by a combination of eight edge-specific rate matrices (four for V1 and four for V2). The number of substitutions per site at the variable sites also differed markedly, with sites belonging to V1 evolving more slowly than those belonging to V2 along the lineages separating the seven species of Saccharomyces. Finally, sites belonging to V1 appeared to have ceased evolving along the lineages separating S. cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, and S. bayanus, implying that they might have become so selectively constrained that they could be considered invariable sites in these species.
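The rate matrices referred to above act on sequences through P(t) = exp(Qt), the matrix of substitution probabilities along an edge of length t. A minimal sketch of one GTR rate matrix and its transition probabilities (the base frequencies and exchangeabilities are made-up values, and the Taylor expansion stands in for a proper matrix exponential):

```python
import numpy as np

def gtr_rate_matrix(pi, rates):
    """GTR rate matrix Q from base frequencies pi (A,C,G,T) and the six
    exchangeabilities (AC, AG, AT, CG, CT, GT): q_ij = r_ij * pi_j."""
    ac, ag, at, cg, ct, gt = rates
    R = np.array([[0, ac, ag, at],
                  [ac, 0, cg, ct],
                  [ag, cg, 0, gt],
                  [at, ct, gt, 0]], dtype=float)
    Q = R * pi
    np.fill_diagonal(Q, -Q.sum(axis=1))   # rows of Q sum to zero
    return Q

def transition_probs(Q, t, terms=60):
    """P(t) = exp(Q t) by a plain Taylor series (adequate for small Q t)."""
    P, term = np.eye(4), np.eye(4)
    for k in range(1, terms):
        term = term @ (Q * t) / k
        P = P + term
    return P

pi = np.array([0.3, 0.2, 0.2, 0.3])          # made-up base frequencies
Q = gtr_rate_matrix(pi, (1, 2, 1, 1, 2, 1))  # made-up exchangeabilities
P = transition_probs(Q, t=0.5)               # substitution probabilities on an edge
```

Because q_ij = r_ij·π_j with symmetric r, π is stationary and the process is reversible; non-SRH models of the kind described above relax exactly this by allowing different, possibly non-reversible Q on different edges.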


This thesis provides two main contributions. The first is BP-TRBAC, a unified authorisation model that can support legacy systems as well as business process systems. BP-TRBAC supports specific features that are required by business process environments. BP-TRBAC is designed to be used as an independent enterprise-wide authorisation model, rather than as part of the workflow system. It is designed to be the main authorisation model for an organisation. The second contribution is BP-XACML, an authorisation policy language that is designed to represent BPM authorisation policies for business processes. This contribution also includes a policy model for BP-XACML. Using BP-TRBAC as an authorisation model, together with BP-XACML as an authorisation policy language, will allow an organisation to manage and control authorisation requests from workflow systems and other legacy systems.
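As a rough illustration of the kind of check a role- and task-aware authorisation model must support (the role and permission tables and the separation-of-duty constraint below are invented for illustration; they are not BP-TRBAC's actual syntax or semantics):

```python
# Hypothetical role/permission tables; illustrative only.
ROLE_PERMISSIONS = {
    "clerk":   {("invoice", "create"), ("invoice", "view")},
    "manager": {("invoice", "view"), ("invoice", "approve")},
}
USER_ROLES = {"alice": {"clerk"}, "bob": {"manager"}}

def is_authorised(user, resource, action, active_task=None, sod=None):
    """Grant the request if any of the user's roles holds the permission
    and no task-level separation-of-duty constraint forbids it."""
    perms = set()
    for role in USER_ROLES.get(user, set()):
        perms |= ROLE_PERMISSIONS.get(role, set())
    if (resource, action) not in perms:
        return False
    if sod and active_task:
        # e.g. whoever performed a conflicting earlier task is excluded
        return user not in sod.get(active_task, set())
    return True
```

The task-level constraint is what distinguishes business-process authorisation from plain RBAC: the same user may hold the right role yet still be denied within a particular workflow instance.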


The main objective of statistical analysis of experimental investigations is to make predictions on the basis of mathematical equations while minimising the number of experiments required. Abrasive jet machining (AJM) is an unconventional and novel machining process wherein microabrasive particles are propelled at high velocities on to a workpiece. The resulting erosion can be used for cutting, etching, cleaning, deburring, drilling and polishing. In the study completed by the authors, statistical design of experiments was successfully employed to predict the rate of material removal by AJM. This paper discusses the details of such an approach and the findings.
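A typical design-of-experiments treatment fits a low-order prediction equation in coded factor levels from a small factorial design. A sketch with a hypothetical 2³ design (the factors, levels and removal-rate values are invented for illustration; they are not the paper's data):

```python
import numpy as np
from itertools import product

# Hypothetical 2^3 factorial design for AJM: coded levels (-1/+1) of
# pressure, abrasive flow rate and stand-off distance.
design = np.array(list(product([-1, 1], repeat=3)), dtype=float)

# Invented response values: material removal rate (mg/min) for the 8 runs.
mrr = np.array([4.1, 5.0, 4.6, 5.8, 5.2, 6.4, 5.9, 7.3])

# First-order prediction equation: mrr = b0 + b1*x1 + b2*x2 + b3*x3
X = np.column_stack([np.ones(len(mrr)), design])
coefs, *_ = np.linalg.lstsq(X, mrr, rcond=None)
b0, b1, b2, b3 = coefs
```

Because the design is orthogonal, each coefficient is simply half the difference between the average responses at the high and low levels of that factor, which is what makes eight runs enough to rank the factors' effects.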


Process view technology is attracting increasing attention in modern business process management, as it enables the customisation of business process representation. This capability helps improve privacy protection, authority control, flexible display, etc., in business process modelling. One approach to generating process views is to allow users to construct an aggregate of their underlying processes. However, most aggregation approaches rely on the strong assumption that business processes are always well-structured, which is overly strict for BPMN. Aiming to build process views for non-well-structured BPMN processes, this paper investigates the characteristics of BPMN structures, such as tasks, events and gateways, and proposes a formal process view aggregation approach to facilitate BPMN process view creation. A set of consistency rules and construction rules is defined to regulate the aggregation and guarantee order preservation and structural and behavioural correctness, and a novel aggregation technique, called EP-Fragment, is developed to tackle non-well-structured BPMN processes.


The process view concept deploys a partial and temporal representation to adjust the visible view of a business process according to the various perception constraints of users. Process view technology is of practical use for privacy protection and authorization control in process-oriented business management. Owing to complex organizational structures, it is challenging for large companies to accurately specify the diverse perceptions of different users over business processes. Aiming to tackle this issue, this article presents a role-based process view model to incorporate role dependencies into process view derivation. Compared to existing process view approaches, ours particularly supports runtime updates to the process view perceivable to a user, with specific view merging operations, thereby enabling the dynamic tracing of process perception. A series of rules and theorems is established to guarantee the structural consistency and validity of process view transformation. A hypothetical case study is conducted to illustrate the feasibility of our approach, and a prototype is developed for proof-of-concept purposes.


This study examines and quantifies the effect of adding polyelectrolytes to cellulose nanofibre suspensions on the gel point of the suspensions, which is the lowest solids concentration at which the suspension forms a continuous network. The lower the gel point, the faster the drainage time to produce a sheet and the higher the porosity of the final sheet formed. Two new techniques were designed to measure the dynamic compressibility and the drainability of nanocellulose–polyelectrolyte suspensions. We developed a master curve which showed that the independent variable controlling the behaviour of nanocellulose suspensions and their composites is the structure of the flocculated suspension, which is best quantified as the gel point. This was independent of the type of polyelectrolyte used. At an addition level of 2 mg/g of nanofibre, a reduction in gel point of over 50% was achieved using either a high molecular weight (13 MDa) linear cationic polyacrylamide (CPAM, 40% charge), a dendrimer polyethylenimine of high molecular weight (750,000 Da; HPEI), or even one of low molecular weight (2000 Da; LPEI). There was no significant difference in the minimum gel point achieved, despite the differences in polyelectrolyte morphology and molecular weight. In this paper, we show that the gel point controls the flow through the fibre suspension, even when comparing fibre suspensions with solids contents above the gel point. A lower gel point makes it easier for water to drain through the fibre network, reducing the pressure required to achieve a given dewatering rate and reducing the filtering time required to form a wet-laid sheet. We further show that the lower gel point partially controls the structure of the wet-laid sheet after it is dried. Halving the gel point increased the air permeability of the dry sheet by 37, 46 and 25%, when using CPAM, HPEI and LPEI, respectively. The resistance to liquid flow was reduced by 74 and 90%, when using CPAM and LPEI.
Analysis of the paper formed shows that the sheet-forming process and final sheet properties can be engineered and controlled by adding polyelectrolytes to the nanofibre suspension.


The purpose of this article is to show the applicability and benefits of design-of-experiments techniques as an optimization tool for discrete simulation models. Simulated systems are computational representations of real-life systems; their characteristics include constant evolution driven by the occurrence of discrete events over time. In this study, a production system designed under the JIT (Just in Time) business philosophy is used, which seeks to achieve excellence in organizations through waste reduction in all operational aspects. The most typical tool of JIT systems is KANBAN production control, which seeks to synchronize demand with the flow of materials, minimize work in process, and define production metrics. Using experimental design techniques for stochastic optimization, the impact of the operational factors on the efficiency of the KANBAN/CONWIP simulation model is analyzed. The results show the effectiveness of integrating experimental design techniques and discrete simulation models in the calculation of the operational parameters. Furthermore, the reliability of the methodologies found was improved with a new statistical consideration.
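A toy version of such a simulation model makes the card-cap mechanism concrete. The two-station line below, with exponential service times and a CONWIP card limit, is an invented illustration (not the authors' model); it produces the kind of response, throughput as a function of the WIP cap, that an experimental design would then analyse:

```python
import random

def simulate_conwip(wip_cap, n_jobs=1000, seed=1):
    """Toy two-station CONWIP line: a new job may enter only when the
    number of jobs holding cards is below wip_cap; service times are
    exponential (rates are invented). Returns throughput (jobs/time)."""
    random.seed(seed)
    t = 0.0                       # time the next job may try to enter
    s1_free = s2_free = 0.0       # times when each station becomes free
    in_system = []                # completion times of jobs holding cards
    finish = 0.0
    for _ in range(n_jobs):
        in_system = [f for f in in_system if f > t]
        if len(in_system) >= wip_cap:       # wait for a card to be released
            t = min(in_system)
            in_system = [f for f in in_system if f > t]
        start1 = max(t, s1_free)
        s1_free = start1 + random.expovariate(1.0)   # station 1 service
        start2 = max(s1_free, s2_free)
        s2_free = start2 + random.expovariate(1.2)   # station 2 service
        in_system.append(s2_free)
        finish = s2_free
    return n_jobs / finish
```

Running it over several card caps (and, in a full study, several service-rate settings) yields the response table that the factorial design and its statistical analysis would operate on.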


We consider the development of statistical models for predicting the constituent concentrations of riverine pollutants, which is a key step in load estimation from frequent flow-rate data and less frequently collected concentration data. We consider how to capture the impacts of past flow patterns via the average discounted flow (ADF), which discounts past flux based on the time elapsed: more recent fluxes are given more weight. However, the effectiveness of the ADF depends critically on the choice of the discount factor, which reflects the unknown environmental cumulating process of the concentration compounds. We propose to choose the discount factor by maximizing the adjusted R2 value or the Nash-Sutcliffe model efficiency coefficient. The R2 values are adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of the constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets, which come from two United States Geological Survey (USGS) gaging stations located in the Des Plaines River and Illinois River basins. The generalized rating-curve approach produces biased estimates of the total sediment loads, by -30% to 83%, whereas the new approaches produce much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load is due to the fact that the predictability of concentration is greatly improved by the additional predictors.
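The ADF construction and the discount-factor selection can be sketched as follows. The recursive exponential form and the grid search below are plausible readings of the description above, not necessarily the paper's exact definitions:

```python
import numpy as np

def average_discounted_flow(q, alpha):
    """Exponentially discounted average of past flows:
    ADF_t = alpha * ADF_{t-1} + (1 - alpha) * q_t (illustrative form)."""
    adf = np.empty_like(q, dtype=float)
    adf[0] = q[0]
    for t in range(1, len(q)):
        adf[t] = alpha * adf[t - 1] + (1 - alpha) * q[t]
    return adf

def choose_alpha(q, c, alphas=np.linspace(0.01, 0.99, 99)):
    """Pick the discount factor maximising the adjusted R2 of a linear
    fit of concentration c on flow and ADF (illustrative criterion)."""
    best = (-np.inf, None)
    n, p = len(c), 2                      # p = number of predictors
    for a in alphas:
        X = np.column_stack([np.ones(n), q, average_discounted_flow(q, a)])
        beta, *_ = np.linalg.lstsq(X, c, rcond=None)
        resid = c - X @ beta
        r2 = 1 - resid.var() / c.var()
        adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)
        if adj > best[0]:
            best = (adj, a)
    return best[1]
```

A large optimal alpha means past fluxes stay influential for a long time (slow exhaustion of the constituent), while a small alpha means the concentration responds mainly to recent flow, which matches the interpretation of the discount factor as an exhaustion-rate measure.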