974 results for Explanatory Sequential Design
Abstract:
(Section headings) Sequential Design; Molecular Weight Range; Functional Monomers: Possibilities, Limits, and Challenges; Block Copolymers: Combinations, Block Lengths, and Purities; Modular Design; End-Group Chemistry; Ligation Protocols; Conclusions.
Abstract:
A sequential design method is presented for the design of thermally coupled distillation sequences. The algorithm starts by selecting a set of sequences in the space of basic configurations, in which the internal structure of condensers and reboilers is explicitly taken into account, extended with the possibility of including divided wall columns (DWCs). This first stage is based on separation tasks (except for the DWCs) and therefore does not provide an actual sequence of columns. In the second stage, the best arrangement in N-1 actual columns is determined, taking into account operability and mechanical constraints. Finally, for a set of candidate sequences, the algorithm tries to reduce the total number of columns by considering Kaibel columns, elimination of transfer blocks, or columns with vertical partitions. An example illustrates the different steps of the sequential algorithm.
Abstract:
In this paper we present a sequential Monte Carlo algorithm for Bayesian sequential experimental design applied to generalised non-linear models for discrete data. The approach is computationally convenient in that the information from newly observed data can be incorporated through a simple re-weighting step. We also consider a flexible parametric model for the stimulus-response relationship, together with a newly developed hybrid design utility that can produce more robust estimates of the target stimulus in the presence of substantial model and parameter uncertainty. The algorithm is applied to hypothetical clinical trial and bioassay scenarios. In the discussion, potential generalisations of the algorithm are suggested that could extend its applicability to a wide variety of scenarios.
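The re-weighting step described in this abstract can be sketched generically: each particle's weight is multiplied by the likelihood of the new observation, then renormalised. The toy Bernoulli/logistic stimulus-response model and all function names below are illustrative assumptions, not the paper's own model.

```python
import numpy as np

def reweight(particles, log_weights, log_likelihood, y_new, design):
    """One SMC update: fold a new observation into the particle weights.

    particles: parameter draws approximating the current posterior
    log_likelihood: function(theta, y, design) -> log p(y | theta, design)
    """
    # Incremental weight is the likelihood of the new datum under each particle.
    log_weights = log_weights + np.array(
        [log_likelihood(th, y_new, design) for th in particles]
    )
    # Normalise in log space for numerical stability.
    log_weights -= np.logaddexp.reduce(log_weights)
    return log_weights

# Toy usage: Bernoulli response with a logistic link at stimulus `design`.
rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, size=1000)   # draws of a slope parameter
log_w = np.full(1000, -np.log(1000))          # uniform initial weights

def loglik(theta, y, d):
    p = 1.0 / (1.0 + np.exp(-theta * d))
    return np.log(p) if y == 1 else np.log1p(-p)

log_w = reweight(particles, log_w, loglik, y_new=1, design=0.5)
posterior_mean = np.sum(np.exp(log_w) * particles)
```

Because a success at a positive stimulus up-weights particles with larger slopes, the weighted posterior mean shifts above the unweighted prior mean.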
Abstract:
This research aimed to gain a sophisticated understanding of self-disclosure on Facebook across two distinctive cultures, Saudi Arabia and Australia. This study utilised an explanatory sequential mixed methods design, consisting of a quantitative phase followed by a qualitative phase. Findings from both quantitative and qualitative data provide a broad understanding of the types of information that people self-disclose on Facebook, identify factors that have a significant influence (either positive or negative) on such disclosure, and explain how it is affected by one's national culture.
Abstract:
The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed where it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals which is afforded by the use of the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered for demonstrating the performance of total entropy in comparison to utilities for model discrimination and parameter estimation. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals with little compromise in achieving either goal. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
Abstract:
Yao, Begg, and Livingston (1996, Biometrics 52, 992-1001) considered the optimal group size for testing a series of potentially therapeutic agents to identify a promising one as soon as possible for given error rates. The number of patients to be tested with each agent was fixed as the group size. We consider a sequential design that allows early acceptance and rejection, and we provide an optimal strategy to minimize the sample sizes (patients) required using Markov decision processes. The minimization is under the constraints of the two types (false positive and false negative) of error probabilities, with the Lagrangian multipliers corresponding to the cost parameters for the two types of errors. Numerical studies indicate that there can be a substantial reduction in the number of patients required.
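The early accept/reject idea in this abstract can be illustrated with a Wald-style sequential probability ratio test. This is a deliberately simplified stand-in, not the authors' Markov decision process optimisation; the error rates alpha and beta play the role of the two constrained error probabilities.

```python
import math

def sprt(observations, p0, p1, alpha, beta):
    """Wald SPRT for Bernoulli data: stop early to accept H0 (p=p0) or H1 (p=p1).

    alpha: allowed false-positive rate; beta: allowed false-negative rate.
    Returns ('accept_H1' | 'accept_H0' | 'continue', number of observations used).
    """
    upper = math.log((1 - beta) / alpha)   # crossing above -> accept H1
    lower = math.log(beta / (1 - alpha))   # crossing below -> accept H0
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        # Log-likelihood ratio contribution of one Bernoulli observation.
        llr += math.log(p1 / p0) if x == 1 else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept_H1", n
        if llr <= lower:
            return "accept_H0", n
    return "continue", len(observations)
```

For example, with p0 = 0.2, p1 = 0.5, alpha = 0.05 and beta = 0.1, a run of successes triggers early acceptance of H1 after only four responses, illustrating the sample-size savings that motivate sequential designs.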
Abstract:
Despite the popularity of the Theory of Planned Behaviour (TPB), there is a lack of research assessing the efficacy of the model in understanding the health behaviour of children, with those studies that have been conducted reporting problems with questionnaire formulation and low to moderate internal consistencies for TPB constructs. The aim of this study was to develop and test a TPB-based measure suitable for use with primary school children aged 9 to 10 years. A mixed methods sequential design was employed. In Stage 1, seven semi-structured focus group discussions (N=56) were conducted to elicit the underlying beliefs specific to tooth brushing. Using thematic content analysis, the beliefs were identified and a TPB measure was developed. A repeated measures design was employed in Stage 2, using test-retest reliability analysis to assess its psychometric properties. In all, 184 children completed the questionnaire. Test-retest reliabilities support the validity and reliability of the TPB measure for assessing the tooth brushing beliefs of children. Pearson's product moment correlations were calculated for all of the TPB beliefs, achieving substantial to almost perfect agreement levels. Specifically, a significant relationship between all 10 of the direct and indirect TPB constructs was achieved at the 0.01 level. This paper discusses the design and development of the measure so that it can serve as a guide to fellow researchers and health psychologists interested in using theoretical models to investigate the health and well-being of children.
Abstract:
This is a study of the implementation and impact of formative assessment strategies on the motivation and self-efficacy of secondary school mathematics students. An explanatory sequential mixed methods design was implemented in which quantitative and qualitative data were collected and analyzed sequentially in two phases. The first phase involved quantitative data from student questionnaires and the second phase involved qualitative data from individual student and teacher interviews. The findings of the study suggest that formative assessment is implemented in practice in diverse ways and is a process in which the strategies are interconnected. Teachers experience difficulty in incorporating peer and self-assessment and perceive a need for exemplars. Key factors described as influencing implementation include teaching philosophies, interpretation of ministry documents, teachers' experiences, leadership in administration and department, teacher collaboration, misconceptions of teachers, and student understanding of formative assessment. Findings suggest that overall, formative assessment positively impacts student motivation and self-efficacy, because the feedback provided offers encouragement and recognition by highlighting the progress that has been made and what steps need to be taken to improve. However, students are impacted differently, with considerations including how students perceive mistakes and whether they fear judgement. Additionally, the impact of formative assessment is influenced by the connection between self-efficacy and motivation, namely that how well a student is doing is a source of both.
Abstract:
Relational care practices (RCPs) lie at the very heart of the professional norms and values that define the quality of nursing practice, but they are often compromised by an unfavourable work environment. The difficulty nurses face in actualising these RCPs, which are embedded in nurse-patient interactions through a set of caring behaviours, constitutes a threat to the quality of care while creating considerable frustration for nurses. By emphasising the relational aspect of the nursing process, this research, approached from the perspective of caring, reflects an innovative vision of care quality and service organisation, aiming to explain the impact of organisational climate on the shaping of RCPs and on the professional satisfaction of staff nurses in a hospital setting. The study is based on an adaptation of Duffy and Hoskins' (2003) Quality-Caring Model©, which combines Donabedian's (1980, 1992) model of quality assessment and Watson's (1979, 1988) theory of Human Caring. A mixed methods explanatory sequential design was adopted, combining a quantitative predictive correlational method and a qualitative single case study with embedded levels of analysis. For the quantitative component, conducted with staff nurses (n = 292), validated Likert-type scales were used to measure the following variables: organisational climate (global and five composite dimensions); preferred RCPs; actual RCPs; the gap between preferred and actual RCPs; and professional satisfaction. Hierarchical linear regression analyses addressed the six hypotheses of the quantitative component.
For the qualitative component, data from documentary sources, comments collected in the questionnaires, and interviews conducted with various actors (n = 15) were processed systematically through content analysis in order to explain the links between the concepts of interest. Quantitative and qualitative inferences were integrated following a complementarity approach. From the quantitative component, we note that once control variables were taken into account, only one composite dimension of organisational climate, namely task characteristics, explained 5% of the variance in preferred RCPs. The global organisational climate and its composite dimensions relating to role, organisation, supervisor and team characteristics were strong explanatory factors of actual RCPs (5% to 11% of the variance), of the gap between preferred and actual RCPs (4% to 9%), and of the professional satisfaction (13% to 30%) of staff nurses. Moreover, beyond the substantial impact of the global organisational climate and the control variables, the frequency of RCPs contributed to increasing nurses' professional satisfaction (β = 0.31; p < 0.001), whereas the gap between preferred and actual RCPs contributed to decreasing it (β = -0.30; p < 0.001), in very similar proportions (7% and 8%, respectively). The qualitative component highlighted four sets of factors that explain how organisational climate shapes RCPs and nurses' professional satisfaction: 1) workload intensity; 2) the team approach and the perception of the nursing role; 3) the perception of the supervisor and the organisation; 4) certain characteristics of patients/families and of the nurse.
Analysis of these factors revealed interesting dynamic interactions among four of the five composite dimensions of climate, suggesting that it is possible to influence one dimension by acting on another. The integration of quantitative and qualitative inferences shows the predominant impact of role characteristics on the realisation of RCPs and on nurses' professional satisfaction, while suggesting a systemic approach that draws on multiple factors when implementing interventions aimed at improving nursing work environments in hospital settings.
Abstract:
A study or experiment can be described as sequential if its design includes one or more interim analyses at which it is possible to stop the study, having reached a definitive conclusion concerning the primary question of interest. The potential of the sequential study to terminate earlier than the equivalent fixed sample size study means that, typically, there are ethical and economic advantages to be gained from using a sequential design. These advantages have secured a place for the methodology in the conduct of many clinical trials of novel therapies. Recently, there has been increasing interest in pharmacogenetics: the study of how DNA variation in the human genome affects the safety and efficacy of drugs. The potential for using sequential methodology in pharmacogenetic studies is considered and the conduct of candidate gene association studies, family-based designs and genome-wide association studies within the sequential setting is explored. The objective is to provide a unified framework for the conduct of these types of studies as sequential designs and hence allow experimenters to consider using sequential methodology in their future pharmacogenetic studies.
Abstract:
This sequential explanatory mixed methods study examines the role teachers should play in the development of the teacher evaluation system in Louisiana. These insights are intended to help ensure that teachers act as catalysts in the classroom to significantly increase student achievement, and to enable policymakers, practitioners, and instructional leaders to make informed decisions.
Abstract:
Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation through repeated sampling of data from the model and comparison of observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way.
If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design to accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively 'dies'. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually only surviving a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists. Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than an indirect technique such as muscle strength assessment, which is generally unable to detect progression due to the body's natural attempts at compensation.
Part III of this thesis builds upon a previous Bayesian technique, which develops a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainties. More specifically, we develop a more reliable MUNE method by applying marginalisation over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
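The likelihood-free (ABC) idea summarised in this abstract can be sketched as a basic rejection sampler: draw parameters from the prior, simulate data from the model, and keep draws whose summary statistics fall close to the observed ones. The SMC variants the thesis develops are more efficient refinements of this scheme; the toy model and names below are illustrative assumptions.

```python
import numpy as np

def abc_rejection(observed_summary, prior_sample, simulate, summary,
                  n_draws=5000, tolerance=0.1, rng=None):
    """Basic ABC rejection: keep prior draws whose simulated summaries
    land within `tolerance` of the observed summary."""
    rng = rng or np.random.default_rng()
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)        # draw from the prior
        x = simulate(theta, rng)         # simulate data from the model
        if abs(summary(x) - observed_summary) <= tolerance:
            accepted.append(theta)       # summaries close enough: keep the draw
    return np.array(accepted)

# Toy example: infer the mean of a Normal(theta, 1) from its sample mean.
rng = np.random.default_rng(1)
obs = rng.normal(2.0, 1.0, size=50)      # "observed" data with true mean 2
draws = abc_rejection(
    observed_summary=obs.mean(),
    prior_sample=lambda r: r.uniform(-5, 5),
    simulate=lambda th, r: r.normal(th, 1.0, size=50),
    summary=np.mean,
    tolerance=0.2,
    rng=rng,
)
```

The accepted draws approximate the posterior; here they concentrate near the true mean of 2. The cost is many wasted simulations, which is exactly the inefficiency the SMC-based ABC algorithms aim to reduce.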
Abstract:
In this essay, I present a reflective and generative analysis of Business Process Management research, in which I analyze process management and the surrounding research program from the viewpoint of a theoretical paradigm embracing analytical, empirical, explanatory and design elements. I argue that this view not only reconciles different perceptions of BPM and different research streams, but that it also informs ways in which the BPM research program could develop into a much richer, more inclusive and overall more significant body of work than it has to date. I define three perspectives on a BPM research agenda, give several examples of exciting existing research, and offer key opportunities for further research that can (a) strengthen the core of BPM, (b) generate novel theory from BPM in relevant and topical big issue domains, and (c) explore more rigorously and comprehensively the protective belt of BPM assumptions that much of the present research abides by. The essay ends with some recommendations for continuing the debate about what constitutes BPM and some suggestions for how future research in this area might be carried out.
Abstract:
Stallard (1998, Biometrics 54, 279-294) recently used Bayesian decision theory for sample-size determination in phase II trials. His design maximizes the expected financial gains in the development of a new treatment. However, it results in a very high probability (0.65) of recommending an ineffective treatment for phase III testing. On the other hand, the expected gain using his design is more than 10 times that of a design that tightly controls the false positive error (Thall and Simon, 1994, Biometrics 50, 337-349). Stallard's design maximizes the expected gain per phase II trial, but it does not maximize the rate of gain or the total gain for a fixed length of time, because the rate of gain depends on the proportion of treatments forwarded to the phase III study. We suggest maximizing the rate of gain, and the resulting optimal one-stage design is twice as efficient as Stallard's one-stage design. Furthermore, the new design has a probability of only 0.12 of passing an ineffective treatment to the phase III study.
Abstract:
Aim: Diabetes is an important barometer of health system performance. This chronic condition is a source of significant morbidity and premature mortality and a major contributor to health care costs. There is an increasing focus internationally, and more recently nationally, on system-, practice- and professional-level initiatives to promote the quality of care. The aim of this thesis was to investigate the 'quality chasm' around the organisation and delivery of diabetes care in general practice, to explore GPs' attitudes to engaging in quality improvement activities, and to examine efforts to improve the quality of diabetes care in Ireland from practice to policy. Methods: Quantitative and qualitative methods were used. As part of a mixed methods sequential design, a postal survey of 600 GPs was conducted to assess the organisation of care. This was followed by an in-depth qualitative study using semi-structured interviews with a purposive sample of 31 GPs from urban and rural areas. The qualitative methodology was also used to examine GPs' attitudes to engaging in quality improvement. Data were analysed using a Framework approach. A second observational study was used to assess the quality of care in 63 practices with a special interest in diabetes. Data on 3,010 adults with Type 2 diabetes from three primary care initiatives were analysed and the results were benchmarked against national guidelines and standards of care in the UK. The final study was an instrumental case study of policy formulation. Semi-structured interviews were conducted with 15 members of the Expert Advisory Group (EAG) for Diabetes. Thematic analysis was applied to the data using three theories of the policy process as analytical tools. Results: The survey response rate was 44% (n=262).
Results suggested care delivery was largely unstructured; 45% of GPs had a diabetes register (n=157), 53% reported using guidelines (n=140), 30% had a formal call-recall system (n=78) and 24% had none of these organisational features (n=62). Only 10% of GPs had a formal shared protocol with the local hospital specialist diabetes team (n=26). The lack of coordination between settings was identified as a major barrier to providing optimal care, leading to waiting times, overburdened hospitals and avoidable duplication. The lack of remuneration for chronic disease management had a ripple effect, creating costs for patients and apathy among GPs. There was also a sense of inertia around quality improvement activities, particularly at a national level. This attitude was strongly influenced by previous experiences of change in the health system. In contrast, GPs spoke positively about change at a local level, which was facilitated by practice ethos, leadership and a special interest in diabetes. The second quantitative study found that practices with a special interest in diabetes achieved a standard of care comparable to the UK in terms of the recording of clinical processes of care and the achievement of clinical targets; 35% of patients reached the HbA1c target of <6.5%, compared to 26% in England and Wales. With regard to diabetes policy formulation, the evolving process of action and inaction was best described by the Multiple Streams Theory. Within the EAG, the formulation of recommendations was facilitated by overarching agreement on the "obvious" priorities, while the details of proposals were influenced by personal preferences and local capacity. In contrast, the national decision-making process was protracted and ambiguous. The lack of impetus from senior management, coupled with the lack of power conferred on the EAG, impeded progress. Conclusions: The findings highlight the inconsistency of diabetes care in Ireland.
The main barriers to optimal diabetes management centre on the organisation and coordination of care at the systems level, with consequences for practice, providers and patients. Quality improvement initiatives need to stimulate a sense of ownership and interest among frontline service providers to address the local sense of inertia towards national change. To date, quality improvement in diabetes care has been largely dependent on the "special interest" of professionals. The challenge for the Irish health system is to embed this activity as part of routine practice, professional responsibility and the underlying health care culture.