890 results for phi value analysis


Relevance: 30.00%

Abstract:

“What is value in product development?” is the key question of this paper. The answer is critical to the creation of lean product development. By knowing how much value is added by product development (PD) activities, decisions about how to allocate resources, such as time and money, can be made more rationally. In order to apply the principles of Lean Thinking and remove waste from the product development system, value must be precisely defined. Unfortunately, value is a complex entity composed of many dimensions, and it has thus far eluded definition at a local level. For this reason, research has been initiated on “Measuring Value in Product Development.” This paper serves as an introduction to that research. It presents the current understanding of value in PD, the critical questions involved, and a specific research design to guide the development of a methodology for measuring value. Work on PD value currently focuses either on high-level perspectives on value or on detailed examinations of the attributes that value might have locally in the PD process. Models that attempt to capture value in PD are reviewed; these methods, however, do not capture the depth necessary to allow for application. A methodology is needed to evaluate activities at a local level to determine the amount of value they add and their sensitivity with respect to performance, cost, time, and risk. Two conceptual tools are proposed. The first is a conceptual framework for value creation in PD, referred to here as the Value Creation Model. The second is the Value-Activity Map, which shows the relationships between specific activities and value attributes. These maps will allow a better understanding of the development of value in PD, will facilitate comparison of value development between separate projects, and will provide the information necessary to adapt process analysis tools (such as DSM) to consider value.

The key questions that this research entails are:
· What are the primary attributes of lifecycle value within PD?
· How can one model the creation of value in a specific PD process?
· Can a useful methodology be developed to quantify value in PD processes?
· What are the tools necessary for application?
· What PD metrics will be integrated with the necessary tools?

The research milestones are:
· Collection of value attributes and activities (September 2000)
· Development of a methodology for value-activity association (October 2000)
· Testing and refinement of the methodology (January 2001)
· Tool development (March 2001)
· Presentation of findings at the July INCOSE conference (April 2001)
· Delivery of a thesis that captures a formalized methodology for defining value in PD, including LEM data sheets (June 2001)

The research design aims for the development of two primary deliverables: a methodology to guide the incorporation of value, and a product development tool that will allow direct application.
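The Value-Activity Map described in this abstract lends itself to a simple matrix encoding. The sketch below is purely illustrative: the activity names, attribute names, and weights are invented, since the abstract does not fix any specific representation.

```python
# Hypothetical sketch of a Value-Activity Map: PD activities (rows) scored
# against value attributes (columns). All names and weights are invented.
import numpy as np

activities = ["requirements analysis", "concept design", "prototype test"]
attributes = ["performance", "cost", "time", "risk"]

# Strength of each activity's contribution to each value attribute (0..1).
value_activity_map = np.array([
    [0.8, 0.2, 0.3, 0.6],
    [0.9, 0.5, 0.4, 0.5],
    [0.7, 0.3, 0.6, 0.9],
])

# Summing across attributes gives one crude way to compare where value is
# created in the process - the kind of comparison the map is meant to enable.
for name, row in zip(activities, value_activity_map):
    print(f"{name:>22}: total contribution = {row.sum():.1f}")
```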

Relevance: 30.00%

Abstract:

Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and of the possible processes associated with compositional data sets, in many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and of specific perturbational change. Then we develop, through standard methodology such as generalised likelihood ratio tests, statistical tools that allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests, but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained, together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
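As a hedged illustration of the staying-in-the-simplex toolkit mentioned above (not the authors' code), the sketch below applies a centred log-ratio transform to a few invented compositions and then takes a singular value decomposition, which is the compositional SVD idea in miniature.

```python
# Minimal sketch of a compositional SVD: centred log-ratio (CLR) transform,
# double centring, then an ordinary SVD. The compositions are invented.
import numpy as np

def clr(X):
    """Centred log-ratio transform: log parts minus the row-mean log."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

# Rows are specimens; columns are parts of a composition summing to 1.
X = np.array([[0.60, 0.25, 0.15],
              [0.55, 0.30, 0.15],
              [0.40, 0.35, 0.25]])

Z = clr(X) - clr(X).mean(axis=0)           # centre the specimens as well
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
print("singular values:", np.round(s, 4))   # variability captured by each axis
print("first axis (a log-contrast):", np.round(Vt[0], 3))
```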

Relevance: 30.00%

Abstract:

Background: Genetic and epigenetic factors interacting with the environment over time are the main causes of complex diseases such as autoimmune diseases (ADs). Among the environmental factors are organic solvents (OSs), chemical compounds used routinely in commercial industries. Since controversy exists over whether ADs are caused by OSs, a systematic review and meta-analysis were performed to assess the association between OSs and ADs. Methods and Findings: The systematic search was done in the PubMed, SCOPUS, SciELO and LILACS databases up to February 2012. Any type of study that used accepted classification criteria for ADs and had information about exposure to OSs was selected. Out of a total of 103 articles retrieved, 33 were finally included in the meta-analysis. The final odds ratios (ORs) and 95% confidence intervals (CIs) were obtained by the random-effects model. A sensitivity analysis confirmed that the results were not sensitive to restrictions on the data included. Publication bias was trivial. Exposure to OSs was associated with systemic sclerosis, primary systemic vasculitis and multiple sclerosis individually, and also with all the ADs evaluated taken together as a single trait (OR: 1.54; 95% CI: 1.25-1.92; p-value: 0.001). Conclusion: Exposure to OSs is a risk factor for developing ADs. As a corollary, individuals with non-modifiable risk factors (i.e., familial autoimmunity or carrying genetic factors) should avoid exposure to OSs in order not to increase their risk of developing ADs.
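For readers unfamiliar with the pooling step, the sketch below combines study-level odds ratios with DerSimonian-Laird random-effects weights on the log scale. The study values are hypothetical, and this generic estimator is not necessarily the exact procedure used in the paper.

```python
# Generic DerSimonian-Laird random-effects pooling of odds ratios.
# All study inputs are hypothetical.
import numpy as np

def pooled_or(or_vals, ci_low, ci_high):
    y = np.log(or_vals)                          # per-study log odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1 / se**2                                # fixed-effect weights
    y_fe = (w * y).sum() / w.sum()
    Q = (w * (y - y_fe)**2).sum()                # Cochran's Q heterogeneity
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))
    w_re = 1 / (se**2 + tau2)                    # random-effects weights
    y_re = (w_re * y).sum() / w_re.sum()
    se_re = np.sqrt(1 / w_re.sum())
    return np.exp([y_re, y_re - 1.96 * se_re, y_re + 1.96 * se_re])

est, lo, hi = pooled_or(np.array([1.8, 1.3, 1.6]),
                        np.array([1.1, 0.9, 1.2]),
                        np.array([2.9, 1.9, 2.2]))
print(f"pooled OR: {est:.2f} (95% CI: {lo:.2f}-{hi:.2f})")
```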

Relevance: 30.00%

Abstract:

Can cartoons be considered a viable and credible source for the study of economics? There is hardly any research on the subject, even though there is a quite significant body of cartoons with economic content. This suggests that economists have not paid enough attention to, and do not incorporate into their analysis, a relevant primary source. The present paper aims to explore the value of using cartoons as a complementary primary source in economic analysis. We present a way of analyzing economic history through cartoons: first, by reviewing cartoons that describe particular historical circumstances, and second, by examining cartoons that represent generic economic situations and are not necessarily linked to a historical period. We selected 17 cartoons from different cartoonists, especially Colombian ones, that give an idea of economic matters and economic history.

Relevance: 30.00%

Abstract:

This paper examines two individually administered diagnostic reading tests, the Woodcock Reading Mastery Tests and the Diagnostic Reading Scales, to determine their value for use with hearing-impaired children.

Relevance: 30.00%

Abstract:

We present the extension of a methodology for solving moving boundary value problems from the second-order case to the case of the third-order linear evolution PDE q_t + q_xxx = 0. This extension is the crucial step needed to generalize the methodology to PDEs of arbitrary order. The methodology is based on the derivation of inversion formulae for a class of integral transforms that generalize the Fourier transform, and on the analysis of the global relation associated with the PDE. The study of this relation and its inversion using the appropriate generalized transform are the main elements of the proof of our results.
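As a minimal worked step (not the paper's derivation), substituting a separable exponential into the PDE above fixes the dispersion relation on which any transform-based solution method is built:

```latex
% Sketch: the dispersion relation for q_t + q_xxx = 0.
\[
q(x,t) = e^{ikx - \omega(k) t}
\quad\Longrightarrow\quad
-\omega(k) + (ik)^{3} = 0
\quad\Longrightarrow\quad
\omega(k) = -ik^{3}.
\]
```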

Relevance: 30.00%

Abstract:

Water table response to rainfall was investigated at six sites in the Upper, Middle and Lower Chalk of southern England. Daily time series of rainfall and borehole water level were cross-correlated to investigate seasonal variations in groundwater-level response times, based on periods of 3-month duration. The time lags (in days) yielding significant correlations were compared with the average unsaturated zone thickness during each 3-month period. In general, when the unsaturated zone was more than 18 m thick, the time lag for a significant water-level response increased rapidly once the depth to the water table exceeded a critical value, which varied from site to site. For shallower water tables, a linear relationship between the depth to the water table and the water-level response time was evident. The observed variations in response time can only be partially accounted for by a diffusive model of propagation through the unsaturated matrix, suggesting that some fissure flow was occurring. The majority of rapid responses were observed during the winter/spring recharge period, when the unsaturated zone is thinnest and its moisture content is highest, and were more likely to occur when the rainfall intensity exceeded 5 mm/day. At some sites, a very rapid response within 24 h of rainfall was observed in addition to the longer-term responses, even when the unsaturated zone was up to 64 m thick; this response was generally associated with the autumn period. The results of the cross-correlation analysis provide statistical support for the presence of fissure flow and for the contribution of multiple pathways through the unsaturated zone to groundwater recharge.
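The core computation, a lagged cross-correlation of two daily series, is easy to sketch. The code below uses synthetic rainfall and water-level series (the study's data are not reproduced) and reports the lag with the strongest correlation.

```python
# Lagged cross-correlation of daily rainfall against borehole water level.
# Both series below are synthetic stand-ins for the study's observations.
import numpy as np

def lagged_correlation(rain, level, max_lag):
    """Pearson r between rainfall and the water level lag days later."""
    n = len(rain)
    return np.array([np.corrcoef(rain[:n - lag], level[lag:])[0, 1]
                     for lag in range(max_lag + 1)])

rng = np.random.default_rng(0)
rain = rng.gamma(0.8, 4.0, size=90)                   # ~3 months of daily rain
kernel = np.concatenate([np.zeros(7), np.exp(-np.arange(15) / 5.0)])
level = np.convolve(rain, kernel)[:90]                # response delayed ~7 days
level += rng.normal(0.0, 0.5, size=90)                # measurement noise

corr = lagged_correlation(rain, level, max_lag=30)
best = int(corr.argmax())
print(f"strongest response at lag {best} days (r = {corr[best]:.2f})")
```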

Relevance: 30.00%

Abstract:

There are now considerable expectations that semi-distributed models are useful tools for supporting catchment water quality management. However, insufficient attention has been given to evaluating the uncertainties inherent in this type of model, especially those associated with the spatial disaggregation of the catchment. The Integrated Nitrogen in Catchments model (INCA) is subjected to an extensive regionalised sensitivity analysis in application to the River Kennet, part of the groundwater-dominated upper Thames catchment, UK. The main results are: (1) model output was generally insensitive to land-phase parameters, very sensitive to groundwater parameters, including initial conditions, and significantly sensitive to in-river parameters; (2) INCA was able to produce good fits simultaneously to the available flow, nitrate and ammonium in-river data sets; (3) representing parameters as heterogeneous over the catchment (206 calibrated parameters) rather than homogeneous (24 calibrated parameters) produced a significant improvement in fit to nitrate, but no significant improvement to flow, and caused a deterioration in ammonium performance; (4) the analysis indicated that calibrating the flow-related parameters first and then calibrating the remaining parameters (as opposed to calibrating all parameters together) was not a sensible strategy in this case; (5) even the parameters to which the model output was most sensitive suffered from high uncertainty, due to spatial inconsistencies in the estimated optimum values, parameter equifinality and the sampling error associated with the calibration method; (6) soil and groundwater nutrient and flow data are needed to reduce uncertainty in initial conditions, residence times and nitrogen transformation parameters, and long-term historic data are needed so that key responses to changes in land-use management can be assimilated. The results indicate the general difficulty of reconciling the questions which catchment nutrient models are expected to answer with typically limited data sets and limited knowledge about suitable model structures. The results demonstrate the importance of analysing semi-distributed model uncertainties prior to model application, and illustrate the value and limitations of using Monte Carlo-based methods for doing so.
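A regionalised sensitivity analysis of the kind applied here can be sketched generically: sample many parameter sets, run the model, split the runs into behavioural and non-behavioural according to a fit criterion, and compare the two parameter distributions. The model below is a stand-in toy function, not INCA.

```python
# Generic regionalised (Monte Carlo) sensitivity analysis with a toy model.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
n_runs = 5000
params = rng.uniform(0.0, 1.0, size=(n_runs, 3))    # three invented parameters

def toy_model_error(p):
    # Stand-in for a model-vs-observation error measure; by construction the
    # fit is sensitive mainly to p[0] and only weakly to p[1].
    return abs(p[0] - 0.3) + 0.1 * abs(p[1] - 0.5)

errors = np.array([toy_model_error(p) for p in params])
behavioural = errors < np.quantile(errors, 0.10)    # keep the best 10% of runs

for i in range(params.shape[1]):
    d = ks_2samp(params[behavioural, i], params[~behavioural, i]).statistic
    print(f"parameter {i}: KS distance = {d:.3f}")  # larger => more sensitive
```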

Relevance: 30.00%

Abstract:

Stable isotopic characterization of chlorine in chlorinated aliphatic pollution is potentially very valuable for risk assessment and for monitoring remediation or natural attenuation. The approach has been underused because of the complexity of the analysis and the time it takes. We have developed a new method that eliminates sample preparation. Gas chromatography produces individually eluted sample peaks for analysis. The He carrier gas is mixed with Ar and introduced directly into the torch of a multicollector ICPMS. The MC-ICPMS is run at a high mass resolution of ≥10 000 to eliminate the interference of ArH with Cl at mass 37. The standardization approach is similar to that for continuous-flow stable isotope analysis, in which sample and reference materials are measured successively. We have measured PCE relative to a laboratory TCE standard mixed with the sample. Solvent samples of 200 nmol to 1.3 μmol (24-165 μg of Cl) were measured. The PCE gave the same value relative to the TCE as measured by the conventional method, with a precision of 0.12‰ (2 × standard error), but poorer precision for the smaller samples.
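The standardization arithmetic behind running sample and reference together is the usual delta notation; here is a minimal sketch with invented isotope ratios.

```python
# Delta notation for 37Cl/35Cl, measured against a co-run reference.
# The ratio values below are invented for illustration.
def delta_per_mil(r_sample, r_reference):
    """delta = (R_sample / R_reference - 1) * 1000, in per mil."""
    return (r_sample / r_reference - 1.0) * 1000.0

r_pce = 0.31996   # hypothetical 37Cl/35Cl of the PCE sample peak
r_tce = 0.31985   # hypothetical ratio of the co-run TCE laboratory standard
print(f"delta37Cl vs lab TCE: {delta_per_mil(r_pce, r_tce):+.2f} per mil")
```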

Relevance: 30.00%

Abstract:

A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Past experiments have demonstrated the value of the model and the robustness of decisions based on it: the experimental results agree well with a Markov-model theory. Here, the reference model is exercised on the well-known endgame KBBKN.
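The Markov-model connection can be sketched generically: treat depth-to-win as the state of a chain in which a fallible player usually makes progress but sometimes slips back, and solve for the expected number of moves to the win. The transition structure and probabilities below are invented for illustration and are not the paper's reference model.

```python
# Toy Markov chain for fallible endgame play: transient states are
# depths-to-win 1..n (index d holds depth d+1); winning leaves the chain.
import numpy as np

n = 10          # hypothetical maximum depth-to-win
p_best = 0.9    # probability of a value-preserving (progress) move

P = np.zeros((n, n))
for d in range(n):
    if d > 0:
        P[d, d - 1] = p_best          # progress: depth-to-win decreases
    # (from d == 0, the progress move wins and leaves the transient states)
    if d + 1 < n:
        P[d, d + 1] = 1 - p_best      # error: depth-to-win increases
    else:
        P[d, d] = 1 - p_best          # already at maximum depth: stay

# Expected moves to absorption (the win): t = (I - P)^(-1) 1.
t = np.linalg.solve(np.eye(n) - P, np.ones(n))
print("expected moves to win from depth 5:", round(t[4], 1))
```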

Relevance: 30.00%

Abstract:

A reference model of Fallible Endgame Play has been implemented and exercised with the chess engine WILHELM. Various experiments have demonstrated the value of the model and the robustness of decisions based on it. Experimental results have also been compared with the theoretical predictions of a Markov model of the endgame and found to be in close agreement.

Relevance: 30.00%

Abstract:

We consider boundary value problems for the N-wave interaction equations in one and two space dimensions, posed for x ≥ 0 and x, y ≥ 0, respectively. Following the recent work of Fokas, we develop an inverse scattering formalism to solve these problems by considering the simultaneous spectral analysis of the two ordinary differential equations in the associated Lax pair. The solution of the boundary value problems is obtained through the solution of a local Riemann–Hilbert problem in the one-dimensional case, and of a nonlocal Riemann–Hilbert problem in the two-dimensional case.

Relevance: 30.00%

Abstract:

We study the elliptic sine-Gordon equation in the quarter plane using a spectral transform approach. We determine the Riemann-Hilbert problem associated with well-posed boundary value problems in this domain and use it to derive a formal representation of the solution. Our analysis is based on a generalization of the usual inverse scattering transform recently introduced by Fokas for studying linear elliptic problems.
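For reference, the elliptic sine-Gordon equation posed in the quarter plane, in a standard normalization (the paper's normalization may differ):

```latex
% Elliptic sine-Gordon equation in the quarter plane (standard form).
\[
q_{xx} + q_{yy} = \sin q, \qquad 0 < x < \infty, \quad 0 < y < \infty.
\]
```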

Relevance: 30.00%

Abstract:

The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analyses of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon, it is suggested to use instead an overall estimate of the misclassification error, previously suggested and known as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator is suggested as a summary measure of the overall misclassification error, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound for stroke prevention, with angiography as the standard.
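A simplified sketch of the quantities involved: per-study Youden indices computed from 2x2 counts, plus a weighted summary. The counts are hypothetical, and the simple size weights stand in for the paper's Mantel-Haenszel construction.

```python
# Per-study Youden index J = sensitivity + specificity - 1, with a weighted
# summary. Counts are invented; weights are a stand-in, not Mantel-Haenszel.
import numpy as np

# Each row: true positives, false negatives, true negatives, false positives.
studies = np.array([
    [45,  5, 80, 20],
    [30, 10, 60, 15],
    [70, 15, 90, 10],
])

tp, fn, tn, fp = studies.T
sens = tp / (tp + fn)
spec = tn / (tn + fp)
J = sens + spec - 1                      # 1 = perfect test, 0 = uninformative
weights = studies.sum(axis=1)            # simple study-size weights
print("per-study J:", np.round(J, 3))
print("summary J:  ", round(np.average(J, weights=weights), 3))
```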