901 results for two-stage sequential procedure


Relevance: 100.00%

Abstract:

The aim of this paper is to develop a comprehensive taxonomy of green supply chain management (GSCM) practices and a structural equation modelling-driven decision support system based on that taxonomy, giving managers a better understanding of the complex relationships between external and internal factors and GSCM operational practices. Typology and taxonomy play a key role in the development of social science theories. Current taxonomies focus on a single or limited component of the supply chain. Furthermore, they have not been tested using different sample compositions and contexts, yet replication is a prerequisite for developing robust concepts and theories. In this paper, we empirically replicate one such taxonomy, extending the original study by (a) developing a broad taxonomy containing the key components of the supply chain; (b) broadening the sample to include a wider range of sectors and organisational sizes; and (c) broadening the geographic scope of the previous studies. Moreover, we include both objective measures and subjective attitudinal measurements. We use a robust two-stage cluster analysis to develop our GSCM taxonomy. The main finding validates the previously proposed taxonomy and identifies size, attitude, and level of environmental risk and impact as key mediators between internal drivers, external drivers, and GSCM operational practices.
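
The abstract does not spell out the clustering procedure, but a common reading of a two-stage cluster analysis is a hierarchical (Ward) stage to choose the number of clusters followed by a k-means stage seeded with those centroids. The sketch below illustrates that generic idea on made-up firm-by-practice data; the data matrix, variable names, and the choice of three clusters are assumptions, not the paper's data or exact method.

```python
# A minimal sketch of a two-stage cluster analysis of survey responses,
# assuming a hypothetical firms-by-practices matrix; the paper's exact procedure may differ.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))          # hypothetical: 120 firms, 8 GSCM practice scores
X = StandardScaler().fit_transform(X)

# Stage 1: hierarchical (Ward) clustering to suggest the number of clusters.
Z = linkage(X, method="ward")
k = 3                                  # chosen by inspecting the dendrogram / merge heights
stage1_labels = fcluster(Z, t=k, criterion="maxclust")

# Stage 2: k-means seeded with the stage-1 cluster centroids to refine assignments.
seeds = np.vstack([X[stage1_labels == c].mean(axis=0) for c in range(1, k + 1)])
taxonomy = KMeans(n_clusters=k, init=seeds, n_init=1, random_state=0).fit(X)
print(np.bincount(taxonomy.labels_))   # cluster sizes of the resulting taxonomy
```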

Relevance: 100.00%

Abstract:

We present a method for the recognition of complex actions. Our method combines automatic learning of simple actions and manual definition of complex actions in a single grammar. Contrary to the general trend in complex action recognition, which divides recognition into two stages, our method recognises simple and complex actions in a unified way. This is achieved by encoding simple-action HMMs within the stochastic grammar that models complex actions. The unified approach allows the higher activity layers to influence the recognition of simple actions more effectively, which leads to a substantial improvement in the classification of complex actions. We consider the recognition of complex actions based on person transits between areas in the scene. As input, our method receives crossings of tracks along a set of zones, which are derived using unsupervised learning of the movement patterns of the objects in the scene. We evaluate our method on a large dataset showing normal, suspicious and threat behaviour in a parking lot. Experiments show an improvement of about 30% in the recognition of both high-level scenarios and their constituent simple actions with respect to a two-stage approach. Experiments with synthetic noise simulating the most common tracking failures show that our method experiences only a limited decrease in performance when moderate amounts of noise are added.
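
As a rough illustration of the simple-action building block only, the sketch below fits one Gaussian HMM per action class on made-up zone-crossing feature sequences and classifies a new track by log-likelihood. It relies on the hmmlearn package; the action names, features and model sizes are assumptions, and the paper's stochastic grammar that embeds such HMMs is not reproduced here.

```python
# Minimal sketch of per-class Gaussian HMMs for simple actions, classified by
# log-likelihood. The paper embeds these HMMs inside a stochastic grammar,
# which is not shown here. Requires the hmmlearn package.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(1)

def make_track(offset, n=30):
    """Hypothetical 2-D feature sequence, e.g. zone-crossing coordinates."""
    return rng.normal(loc=offset, scale=0.5, size=(n, 2))

train = {"walk_to_car": [make_track(0.0) for _ in range(20)],
         "loiter":      [make_track(3.0) for _ in range(20)]}

models = {}
for action, seqs in train.items():
    X = np.vstack(seqs)
    lengths = [len(s) for s in seqs]
    m = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
    m.fit(X, lengths)                      # one HMM per simple-action class
    models[action] = m

test = make_track(0.0)                     # an unseen track
scores = {a: m.score(test) for a, m in models.items()}
print(max(scores, key=scores.get), scores) # pick the class with the highest log-likelihood
```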

Relevance: 100.00%

Abstract:

During the development of new therapies, it is not uncommon to test whether a new treatment works better than the existing treatment for all patients who suffer from a condition (the full population) or for a subset of the full population (a subpopulation). One approach to this objective is to run two separate trials: in the first trial, data are collected to determine whether the new treatment benefits the full population or the subpopulation, and the second trial is a confirmatory trial testing the new treatment in the population selected in the first trial. In this paper, we consider the more efficient two-stage adaptive seamless designs (ASDs), where in stage 1 data are collected to select the population to test in stage 2, and in stage 2 additional data are collected to perform confirmatory analysis for the selected population. Unlike the approach that uses two separate trials, in ASDs the stage 1 data are also used in the confirmatory analysis. Although ASDs are efficient, using stage 1 data both for selection and for confirmatory analysis introduces selection bias and, consequently, statistical challenges in making inference. We focus on point estimation for such trials. We describe the extent of the bias of estimators that ignore the multiple hypotheses and the selection, based on observed stage 1 data, of the population most likely to give positive trial results. We then derive conditionally unbiased estimators and examine their mean squared errors for different scenarios.
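
A small simulation can illustrate the selection bias the paper addresses: when the population with the better-looking stage 1 estimate is selected and the naive estimator pools stage 1 and stage 2 data, the estimate is biased upward even when both true effects are identical. The effect sizes, sample sizes and normal approximation below are assumptions for illustration; this is not the paper's estimator.

```python
# Toy simulation of selection bias in a two-stage adaptive seamless design:
# after stage 1, the population (full vs. subpopulation) with the larger observed
# effect is selected, and the naive estimator pools stage 1 and stage 2 data.
import numpy as np

rng = np.random.default_rng(2)
true_effect = {"full": 0.20, "sub": 0.20}   # identical true effects in both populations
n1, n2, sd, reps = 100, 200, 1.0, 20_000    # hypothetical sample sizes and outcome SD

naive = []
for _ in range(reps):
    stage1 = {p: rng.normal(true_effect[p], sd / np.sqrt(n1)) for p in true_effect}
    sel = max(stage1, key=stage1.get)                    # select the better-looking population
    stage2 = rng.normal(true_effect[sel], sd / np.sqrt(n2))
    naive.append((n1 * stage1[sel] + n2 * stage2) / (n1 + n2))

print("true effect:", true_effect["full"])
print("mean naive estimate for selected population:", np.mean(naive))  # exceeds 0.20 on average
```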

Relevance: 100.00%

Abstract:

The low-activity variant of the monoamine oxidase A (MAOA) functional promoter polymorphism, MAOA-LPR, in interaction with adverse environments (G × E), is associated with child and adult antisocial behaviour disorders. MAOA is expressed during foetal development, so in utero G × E may influence early neurodevelopment. We tested the hypothesis that MAOA G × E during pregnancy predicts infant negative emotionality soon after birth. In an epidemiological longitudinal study starting in pregnancy, using a two-stage stratified design, we ascertained MAOA-LPR status (low- vs. high-activity variants) from the saliva of 209 infants (104 boys and 105 girls) and examined whether life events during pregnancy predicted observed infant negative emotionality at 5 weeks post-partum. In analyses weighted to provide estimates for the general population, and including possible confounders of life events, there was an MAOA status by life events interaction (P = 0.017). There was also an interaction between MAOA status and neighbourhood deprivation (P = 0.028). Both interactions arose from a greater effect of increasing life events on negative emotionality in MAOA-LPR low-activity infants compared with MAOA-LPR high-activity infants. The study provides the first evidence of moderation by MAOA-LPR of the effect of the social environment in pregnancy on negative emotionality in infancy, an early risk factor for the development of child and adult antisocial behaviour disorders.
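
For readers unfamiliar with how such a G × E analysis is set up, the sketch below fits a weighted regression of negative emotionality on MAOA status, life events and their interaction using simulated data, with ad hoc design weights standing in for the two-stage stratified design. Variable names, effect sizes and weights are hypothetical, not the study's data.

```python
# Minimal sketch of a weighted gene-by-environment interaction model on simulated
# data: negative emotionality regressed on MAOA status, life events and their
# product, with sampling weights approximating a two-stage stratified design.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 209
df = pd.DataFrame({
    "maoa_low":    rng.integers(0, 2, n),        # 1 = low-activity MAOA-LPR variant (hypothetical)
    "life_events": rng.poisson(3, n),            # count of prenatal life events (hypothetical)
    "weight":      rng.uniform(0.5, 2.0, n),     # design weights to the general population
})
df["neg_emotion"] = (0.1 * df.life_events
                     + 0.15 * df.maoa_low * df.life_events   # assumed G x E effect
                     + rng.normal(0, 1, n))

fit = smf.wls("neg_emotion ~ maoa_low * life_events", data=df, weights=df.weight).fit()
print(fit.params["maoa_low:life_events"], fit.pvalues["maoa_low:life_events"])
```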

Relevance: 100.00%

Abstract:

Sclera segmentation is of significant importance for eye and iris biometrics. However, it has not been extensively researched as a separate topic, having mainly been treated as a component of a broader task. This paper proposes a novel sclera segmentation algorithm for colour images which operates at pixel level. By exploring various colour spaces, the proposed approach is robust to image noise and different gaze directions. The algorithm's robustness is enhanced by a two-stage classifier: at the first stage, a set of simple classifiers is employed, while at the second stage, a neural network classifier operates on the probability space generated by the stage-1 classifiers. The proposed method was ranked first in the Sclera Segmentation Benchmarking Competition 2015, part of BTAS 2015, with a precision of 95.05% at a recall of 94.56%.
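
A minimal sketch of the two-stage classifier idea follows: stage 1 applies simple per-pixel classifiers to colour features, and stage 2 trains a small neural network on the stage-1 class probabilities. The features, labels and classifier choices below are stand-ins, not the competition pipeline.

```python
# Sketch of a two-stage pixel classifier: stage 1 = simple classifiers on colour
# features, stage 2 = a small neural network on the stage-1 probabilities.
# Features are random stand-ins for real RGB/HSV/Lab pixel values.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
X = rng.uniform(size=(5000, 9))                  # hypothetical pixel features in 3 colour spaces
y = (X[:, 0] + 0.5 * X[:, 3] > 1.0).astype(int)  # hypothetical sclera / non-sclera labels

# Stage 1: simple classifiers produce class probabilities per pixel.
stage1 = [LogisticRegression(max_iter=500).fit(X, y), GaussianNB().fit(X, y)]
P = np.column_stack([c.predict_proba(X)[:, 1] for c in stage1])

# Stage 2: a neural network operates on the probability space from stage 1.
# (In practice the two stages would be fit on separate folds to avoid leakage.)
stage2 = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0).fit(P, y)
print("training accuracy:", stage2.score(P, y))
```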

Relevance: 100.00%

Abstract:

Small and medium-sized enterprises (SMEs) play an important role in the European economy. A critical challenge faced by SME leaders, as a consequence of the continuing digital technology revolution, is how to optimally align business strategy with digital technology so as to fully leverage the potential offered by these technologies in pursuit of longevity and growth. There is a paucity of empirical research examining how e-leadership in SMEs drives successful alignment between business strategy and digital technology, fostering longevity and growth. To address this gap, in this paper we develop an empirically derived e-leadership model. We first develop a theoretical model of e-leadership drawing on strategic alignment theory, providing a theoretical foundation for how SMEs can harness digital technology in support of their business strategy to enable sustainable growth. An in-depth empirical study was then undertaken, interviewing 42 successful European SME leaders to validate, advance and substantiate the theoretically driven model. The outcome of this two-stage process (inductive development of a theoretically driven e-leadership model, followed by deductive advancement to a complete model through in-depth interviews with successful European SME leaders) is an e-leadership model with specific constructs fostering effective strategic alignment. The resulting diagnostic model enables SME decision makers to exercise effective e-leadership by creating productive alignment between business strategy and digital technology, improving longevity and growth prospects.

Relevance: 100.00%

Abstract:

Modified fluorcanasite glasses were fabricated either by altering the molar ratios of Na₂O and CaO or by adding P₂O₅ to the parent stoichiometric glass composition. Glasses were converted to glass-ceramics by a controlled two-stage heat treatment process. Rods (2 mm x 4 mm) were produced using the conventional lost-wax casting technique. Osteoconductive 45S5 bioglass was used as a reference material. Biocompatibility and osteoconductivity were investigated by implantation into healing defects (2 mm) in the midshaft of rabbit femora. Tissue response was investigated using conventional histology and scanning electron microscopy. Histological and histomorphometric evaluation of specimens after 12 weeks of implantation showed significantly more bone contact with the surface of the 45S5 bioglass implants than with the other test materials. When bone contact for each material was compared between experimental time points, the Glass-Ceramic 2 (CaO-rich) group showed a significant difference (p = 0.027) at 4 weeks, but no direct contact at 12 weeks. Histology and backscattered electron photomicrographs showed that the modified fluorcanasite glass-ceramic implants had greater osteoconductivity than the parent stoichiometric composition. Of the new materials, fluorcanasite glass-ceramic implants modified by the addition of P₂O₅ showed the greatest stimulation of new mineralized bone tissue formation adjacent to the implants after 4 and 12 weeks of implantation. © 2010 Wiley Periodicals, Inc. J Biomed Mater Res Part A 94A: 760-768, 2010.

Relevance: 100.00%

Abstract:

Objectives. To assess the impact of chronic diseases, and of the number of diseases, on the various aspects of health-related quality of life (HRQOL) among the elderly in São Paulo, Brazil. Methods. The SF-36® Health Survey was used to assess the impact of the most prevalent chronic diseases on HRQOL. A cross-sectional, population-based study was carried out with two-stage stratified cluster sampling. Data were obtained from a multicenter health survey administered through household interviews in several municipalities in the state of São Paulo. The study evaluated seven diseases (arthritis, back pain, depression/anxiety, diabetes, hypertension, osteoporosis, and stroke) and their effects on quality of life. Results. Among the 1958 elderly individuals (60 years of age or older), 13.6% reported not having any of the illnesses, whereas 45.7% presented three or more chronic conditions. The presence of any of the seven chronic illnesses studied had a significant effect on the scores of nearly all the SF-36® scales. HRQOL scores were lowest in relation to depression/anxiety, osteoporosis, and stroke. The higher the number of diseases, the greater the negative effect on the SF-36® dimensions. The presence of three or more diseases significantly affected HRQOL in all areas. The bodily pain, general health, and vitality scales were the most affected by disease. Conclusions. The study detected a high prevalence of chronic diseases among the elderly population and found that the degree of impact on HRQOL depends on the type of disease. The results highlight the importance of preventing and controlling chronic diseases in order to reduce the number of comorbidities and lessen their impact on HRQOL among the elderly.

Relevance: 100.00%

Abstract:

The biosynthesis of poly(3-hydroxybutyrate-co-3-hydroxyvalerate) from sucrose and propionic acid by Burkholderia sacchari IPT 189 was studied using a two-stage bioreactor process. In the first stage, the bacterium was cultivated in a balanced culture medium until sucrose exhaustion. In the second stage, a solution containing sucrose and propionic acid as carbon sources was fed to the bioreactor at various sucrose/propionic acid (s/p) ratios and a constant specific flow rate. Copolymers with 3HV contents ranging from 40 down to 6.5 mol% were obtained, with the 3HV yield from propionic acid (Y3HV/prop) increasing from 1.10 to 1.34 g g⁻¹. A copolymer productivity of 1 g L⁻¹ h⁻¹ was obtained, with polymer content in the biomass rising to 60%, by increasing the specific flow rate at a constant s/p ratio. Increasing 3HV contents were obtained by varying the s/p ratios. A simulation of production costs based on the Y3HV/prop values obtained in the present work indicated that a cost reduction of up to 73% can be reached, approaching US$ 1.00 per kg, which is closer to the cost of producing P3HB from sucrose (US$ 0.75 per kg).

Relevance: 100.00%

Abstract:

This paper addresses the one-dimensional cutting stock problem when demand is a random variable. The problem is formulated as a two-stage stochastic nonlinear program with recourse. The first-stage decision variables are the numbers of objects to be cut according to each cutting pattern. The second-stage decision variables are the numbers of items held or backordered as a result of the decisions made in the first stage. The objective is to minimize the total expected cost incurred in both stages, due to waste and to holding or backordering penalties. A simplex-based method with column generation is proposed for solving a linear relaxation of the resulting optimization problem. The proposed method is evaluated using two well-known measures of the effect of uncertainty in stochastic programming: the value of the stochastic solution (VSS) and the expected value of perfect information (EVPI). The optimal two-stage solution is shown to be more effective than the alternative wait-and-see and expected value approaches, even under small variations in the parameters of the problem.
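
The sketch below sets up a toy, linearized, extensive-form version of this model with scipy's LP solver: stage 1 chooses how many objects to cut per pattern, stage 2 books holding (excess) or backordering (shortage) quantities per demand scenario, and RP, EVPI and VSS are computed from the recourse, wait-and-see and expected value problems. Patterns, costs and scenarios are made up, and all patterns are enumerated rather than generated by column generation as in the paper.

```python
# Toy extensive-form two-stage stochastic cutting stock model with scipy.
import numpy as np
from scipy.optimize import linprog

patterns = np.array([[2, 1, 0],      # copies of item 1 (length 5) per pattern, stock length 10
                     [0, 1, 3]])     # copies of item 2 (length 3) per pattern
c_obj, h, b = 1.0, 0.1, 2.0          # object, holding and backorder unit costs (hypothetical)
demands = np.array([[20, 30], [30, 50], [40, 60]])   # demand scenarios: (item 1, item 2)
probs = np.array([0.3, 0.4, 0.3])
n_pat, n_item = patterns.shape[1], patterns.shape[0]

def solve(scenarios, weights, fixed_x=None):
    """LP over stage-1 pattern counts x (unless fixed) and stage-2 excess/shortage."""
    n_rec = 2 * n_item * len(scenarios)              # (excess, shortage) per item and scenario
    c = np.concatenate([np.full(n_pat, c_obj),
                        np.concatenate([w * np.array([h] * n_item + [b] * n_item)
                                        for w in weights])])
    A, rhs = [], []
    for s, d in enumerate(scenarios):
        for i in range(n_item):
            row = np.zeros(n_pat + n_rec)
            row[:n_pat] = patterns[i]                # production of item i
            base = n_pat + s * 2 * n_item
            row[base + i] = -1.0                     # excess (held)
            row[base + n_item + i] = 1.0             # shortage (backordered)
            A.append(row); rhs.append(d[i])          # production - excess + shortage = demand
    bounds = [(0, None)] * (n_pat + n_rec)
    if fixed_x is not None:
        bounds[:n_pat] = [(v, v) for v in fixed_x]
    res = linprog(c, A_eq=np.array(A), b_eq=np.array(rhs), bounds=bounds, method="highs")
    return res.fun, res.x[:n_pat]

rp, x_rp = solve(demands, probs)                                     # here-and-now (recourse) value
ws = sum(p * solve([d], [1.0])[0] for p, d in zip(probs, demands))   # wait-and-see value
x_ev = solve([probs @ demands], [1.0])[1]                            # mean-value (expected demand) plan
eev, _ = solve(demands, probs, fixed_x=x_ev)                         # expected cost of that plan
print(f"RP={rp:.2f}  EVPI={rp - ws:.2f}  VSS={eev - rp:.2f}")
```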

Relevance: 100.00%

Abstract:

Mixed models may be defined with or without reference to sampling, and can be used to predict realized random effects, as when estimating the latent values of study subjects measured with response error. When the model is specified without reference to sampling, a simple mixed model includes two random variables, one stemming from an exchangeable distribution of latent values of study subjects and the other from the study subjects' response error distributions. Positive probabilities are assigned both to potentially realizable responses and to artificial responses that are not potentially realizable, resulting in artificial latent values. In contrast, finite population mixed models represent the two-stage process of sampling subjects and measuring their responses, where positive probabilities are assigned only to potentially realizable responses. A comparison of the estimators over the same potentially realizable responses indicates that the optimal linear mixed model estimator (the usual best linear unbiased predictor, BLUP) is often (but not always) more accurate than the comparable finite population mixed model estimator (the FPMM BLUP). We examine a simple example and provide the basis for a broader discussion of the role of conditioning, sampling, and model assumptions in developing inference.
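
As a point of reference for the comparison discussed above, the sketch below computes the usual mixed-model BLUP for subjects' latent values on simulated data, shrinking each subject's sample mean toward the overall mean with variance components treated as known, and compares its mean squared error with that of the raw means. It does not implement the finite population mixed model (FPMM) estimator; sample sizes and variances are assumptions.

```python
# Minimal sketch of the standard mixed-model BLUP for latent subject values
# measured with response error, with known variance components.
import numpy as np

rng = np.random.default_rng(5)
n_subjects, n_reps = 50, 4
sigma_b, sigma_e = 1.0, 2.0                      # latent-value and response-error SDs (assumed known)

mu = 10.0
latent = mu + rng.normal(0, sigma_b, n_subjects)                    # realized latent values
y = latent[:, None] + rng.normal(0, sigma_e, (n_subjects, n_reps))  # responses with error

subject_mean = y.mean(axis=1)
grand_mean = y.mean()
k = sigma_b**2 / (sigma_b**2 + sigma_e**2 / n_reps)  # shrinkage factor
blup = grand_mean + k * (subject_mean - grand_mean)  # shrink toward the overall mean

mse = lambda est: np.mean((est - latent) ** 2)
print(f"MSE raw means: {mse(subject_mean):.3f}   MSE BLUP: {mse(blup):.3f}")
```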

Relevance: 100.00%

Abstract:

When missing data occur in studies designed to compare the accuracy of diagnostic tests, a common, though naive, practice is to base the comparison of sensitivity, specificity, and positive and negative predictive values on some subset of the data that fits into methods implemented in standard statistical packages. Such methods are usually valid only under the strong missing completely at random (MCAR) assumption and may generate biased and less precise estimates. We review models that use the dependence structure of the completely observed cases to incorporate the information from the partially categorized observations into the analysis, and show how they may be fitted via a two-stage hybrid procedure involving maximum likelihood in the first stage and weighted least squares in the second. We indicate how computational subroutines written in R may be used to fit the proposed models, and illustrate the different analysis strategies with observational data collected to compare the accuracy of three distinct non-invasive diagnostic methods for endometriosis. The results indicate that, even when the MCAR assumption is plausible, the naive partial analyses should be avoided.
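
A toy simulation can illustrate the general point about naive complete-case analyses (this is not the paper's two-stage maximum likelihood plus weighted least squares method): when one test's result is missing completely at random, the complete-case sensitivity agrees with the full-data value, whereas when missingness depends on a correlated, always-observed test, it is biased. Prevalences, test mechanisms and missingness rates below are made up.

```python
# Complete-case sensitivity of test A under MCAR vs. missingness driven by a
# correlated, always-observed test B (a simple MAR mechanism).
import numpy as np

rng = np.random.default_rng(6)
n, reps = 4000, 1000
full, mcar, mar = [], [], []
for _ in range(reps):
    d = rng.random(n) < 0.30                    # true disease status (gold standard)
    z = rng.normal(size=n)                      # latent severity shared by both tests
    a = np.where(d, z + rng.normal(0, 1, n) > -0.5, rng.random(n) < 0.10)  # test A result
    b = np.where(d, z + rng.normal(0, 1, n) > 0.0,  rng.random(n) < 0.20)  # test B (always observed)
    obs_mcar = rng.random(n) < 0.5                    # A missing completely at random
    obs_mar = rng.random(n) < np.where(b, 0.8, 0.2)   # A mostly observed when B is positive
    full.append(a[d].mean())                    # sensitivity of A with no missing data
    mcar.append(a[obs_mcar & d].mean())         # complete-case estimate under MCAR
    mar.append(a[obs_mar & d].mean())           # complete-case estimate under MAR

print(f"full data: {np.mean(full):.3f}   MCAR complete-case: {np.mean(mcar):.3f}"
      f"   MAR complete-case: {np.mean(mar):.3f}")
```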

Relevance: 100.00%

Abstract:

Prediction of random effects is an important problem with expanding applications. In the simplest context, the problem corresponds to prediction of the latent value (the mean) of a realized cluster selected via two-stage sampling. Recently, Stanek and Singer [Predicting random effects from finite population clustered samples with response error, J. Amer. Statist. Assoc. 99, 119-130] developed best linear unbiased predictors (BLUP) under a finite population mixed model that outperform BLUPs from mixed models and superpopulation models. Their setup, however, does not allow for unequally sized clusters. To overcome this drawback, we consider an expanded finite population mixed model based on a larger set of random variables that span a higher dimensional space than those typically applied to such problems. We show that BLUPs for linear combinations of the realized cluster means derived under such a model have considerably smaller mean squared error (MSE) than those obtained from mixed models, superpopulation models, and finite population mixed models. We motivate our general approach by an example developed for two-stage cluster sampling and show that it faithfully captures the stochastic aspects of sampling in the problem. We also consider simulation studies to illustrate the increased accuracy of the BLUP obtained under the expanded finite population mixed model. © 2007 Elsevier B.V. All rights reserved.
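
To make the setting concrete, the sketch below simulates two-stage cluster sampling with unequal cluster sizes and compares the raw within-cluster sample mean with a mixed-model-style shrinkage predictor of each realized cluster's finite population mean, with variance components and the overall mean treated as known. This is only an illustration of the problem; it is not the expanded FPMM BLUP derived in the paper.

```python
# Two-stage cluster sampling with unequal cluster sizes: predict each realized
# cluster's finite population mean from a within-cluster sample.
import numpy as np

rng = np.random.default_rng(7)
reps, n_clusters, m_sampled = 2000, 40, 10
sigma_b, sigma_e, overall_mean = 1.0, 2.0, 50.0    # assumed known for the sketch

err_raw, err_shrunk = [], []
for _ in range(reps):
    sizes = rng.integers(20, 200, n_clusters)                   # unequal cluster sizes
    cluster_mu = overall_mean + rng.normal(0, sigma_b, n_clusters)
    values = [mu + rng.normal(0, sigma_e, s) for mu, s in zip(cluster_mu, sizes)]
    target = np.array([v.mean() for v in values])               # realized finite cluster means

    picked = rng.choice(n_clusters, m_sampled, replace=False)   # stage 1: sample clusters
    for c in picked:
        n_i = max(2, sizes[c] // 10)
        sample = rng.choice(values[c], n_i, replace=False)      # stage 2: sample units
        ybar, f = sample.mean(), n_i / sizes[c]
        k = sigma_b**2 / (sigma_b**2 + sigma_e**2 / n_i)
        shrunk = f * ybar + (1 - f) * (overall_mean + k * (ybar - overall_mean))
        err_raw.append((ybar - target[c]) ** 2)
        err_shrunk.append((shrunk - target[c]) ** 2)

print(f"MSE raw mean: {np.mean(err_raw):.3f}   MSE shrinkage predictor: {np.mean(err_shrunk):.3f}")
```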

Relevance: 100.00%

Abstract:

In this thesis, one of the current control algorithms for the R744 cycle, which tries to optimize the performance of the system using two SISO control loops, is compared to a cost-effective system with just one actuator. The operation of a key component of this system, a two-stage orifice expansion valve, is examined over a range of typical climate conditions. One alternative control loop for this system, which has been proposed by the Behr group, is also scrutinized. The simulation results affirm the preference for using two control loops instead of one, but refute the claimed advantages of the Behr alternative control approach over one-loop control. As far as the economic considerations of the A/C unit are concerned, the use of a two-stage orifice expansion valve is desired by the automotive industry; based on the experimental results, an improved control logic for this system is therefore proposed.

In the second part, it is investigated whether the one-actuator control approach is applicable to a system consisting of two parallel evaporators, to allow passengers to control different climate zones. The simulation results show that, in the case of using a two-stage orifice valve for the front evaporator and a fixed expansion valve for the rear one, a proper distribution of the cooling power between the front and rear compartments is possible for a broad range of climate conditions.
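
As a generic illustration of the two-SISO-loop idea (not the thesis's R744 model), the sketch below simulates a toy coupled two-output plant: two PI loops, each pairing one actuator with one output, drive both outputs to their setpoints, whereas a single actuator can regulate only one output. Plant dynamics, gains and setpoints are invented for the example.

```python
# Two independent SISO PI loops vs. a single actuator on a toy 2x2 linear plant.
import numpy as np

A = np.array([[0.90, 0.05], [0.03, 0.92]])     # toy discrete-time plant with weak cross-coupling
B = np.array([[0.10, 0.02], [0.01, 0.12]])     # two actuators (e.g. valve opening, compressor speed)
setpoint = np.array([1.0, 2.0])                # hypothetical targets for the two outputs
kp, ki, steps = 1.5, 0.4, 200

def simulate(active):
    """Run PI loops; `active` lists which actuator/output pairings are used."""
    y, integ = np.zeros(2), np.zeros(2)
    for _ in range(steps):
        err = setpoint - y
        integ[active] += err[active]
        u = np.zeros(2)
        u[active] = kp * err[active] + ki * integ[active]
        y = A @ y + B @ u
    return y

print("two SISO loops, final outputs:", simulate([0, 1]))   # both outputs reach their setpoints
print("one actuator only, final outputs:", simulate([0]))   # second output is left uncontrolled
```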

Relevance: 100.00%

Abstract:

This thesis is an application of the Almost Ideal Demand System (AIDS) approach of Deaton and Muellbauer (1980) to a particular pharmaceutical, Citalopram, in which Gorman's (1971) multi-stage budgeting approach is applied, since it is one of the most useful approaches to estimating demand for differentiated products. Citalopram is an antidepressant drug used in the treatment of major depression. As for most other pharmaceuticals whose patent has expired, there exist branded and generic versions of Citalopram. This thesis aims to define its demand system with two-stage models for the branded version and five generic versions, and to show whether the generic versions are able to compete with the branded version. I calculated the own-price elasticities, which made it possible to compare and draw conclusions about consumers' choices between the branded and generic drugs. Even though the models could be developed further with additional variables, the estimation results and the uncompensated price elasticities indicate that the branded version still has power in the market, and that generics are able to compete with lower prices. One important point to take into consideration is that the Swedish pharmaceutical market underwent a reform on October 1, 2002, aimed at making consumers better informed about prices and decreasing overall expenditure on pharmaceuticals. Since there were not enough generic sales before the reform to include in the calculations, this thesis covers sales after the reform.
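
A minimal sketch of the kind of single-equation LA-AIDS estimation and elasticity calculation involved: a budget-share equation is estimated by OLS on simulated data using the Stone price index, and the uncompensated own-price elasticity is approximated by the standard formula e_ii ≈ -1 + γ_ii/w̄_i - β_i. The data, prices and the restriction to one share equation are assumptions; the thesis estimates a fuller two-stage system for the branded version and five generics.

```python
# LA-AIDS share equation w = alpha + gamma_own*ln(p_own) + gamma_other*ln(p_other)
# + beta*ln(x/P), with the Stone index for P, fitted by OLS on made-up data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
T = 120                                            # hypothetical monthly observations
lp_brand = rng.normal(1.0, 0.1, T)                 # log price, branded version
lp_gen = rng.normal(0.6, 0.1, T)                   # log price, a generic version
lx = rng.normal(5.0, 0.2, T)                       # log total Citalopram expenditure
w_brand = 0.5 - 0.3 * lp_brand + 0.2 * lp_gen + 0.05 * lx + rng.normal(0, 0.02, T)

stone = w_brand * lp_brand + (1 - w_brand) * lp_gen            # Stone (log) price index
df = pd.DataFrame({"w": w_brand, "lp_own": lp_brand, "lp_other": lp_gen,
                   "lreal_x": lx - stone})
fit = smf.ols("w ~ lp_own + lp_other + lreal_x", data=df).fit()

gamma_own, beta = fit.params["lp_own"], fit.params["lreal_x"]
w_bar = df.w.mean()
own_price_elasticity = -1 + gamma_own / w_bar - beta           # uncompensated own-price elasticity
print(f"own-price elasticity (branded): {own_price_elasticity:.2f}")
```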