111 results for SEQUENTIAL CONVERGENCE
Abstract:
Dose-finding trials are a form of clinical data collection in which the primary objective is to estimate an optimum dose of an investigational new drug when given to a patient. This thesis develops and explores three novel dose-finding design methodologies. All design methodologies presented in this thesis are pragmatic: they use statistical models, incorporate clinicians' prior knowledge efficiently, and can stop a trial early for safety or futility reasons. Designing actual dose-finding trials using these methodologies will minimize practical difficulties, improve the efficiency of dose estimation, offer the flexibility to stop early, and reduce possible patient discomfort or harm.
Abstract:
Section headings: Sequential Design; Molecular Weight Range; Functional Monomers: Possibilities, Limits, and Challenges; Block Copolymers: Combinations, Block Lengths, and Purities; Modular Design; End-Group Chemistry; Ligation Protocols; Conclusions
Abstract:
We investigated memories of room-sized spatial layouts learned by sequentially or simultaneously viewing objects from a stationary position. In three experiments, sequential viewing (one or two objects at a time) yielded subsequent memory performance that was equivalent or superior to simultaneous viewing of all objects, even though sequential viewing lacked direct access to the entire layout. This finding was replicated by replacing sequential viewing with directed viewing in which all objects were presented simultaneously and participants’ attention was externally focused on each object sequentially, indicating that the advantage of sequential viewing over simultaneous viewing may have originated from focal attention to individual object locations. These results suggest that memory representation of object-to-object relations can be constructed efficiently by encoding each object location separately, when those locations are defined within a single spatial reference system. These findings highlight the importance of considering object presentation procedures when studying spatial learning mechanisms.
Abstract:
Objective To evaluate methods for monitoring monthly aggregated hospital adverse event data that display clustering, non-linear trends and possible autocorrelation. Design Retrospective audit. Setting The Northern Hospital, Melbourne, Australia. Participants 171,059 patients admitted between January 2001 and December 2006. Measurements The analysis is illustrated with 72 months of patient fall injury data using a modified Shewhart U control chart, and charts derived from a quasi-Poisson generalised linear model (GLM) and a generalised additive mixed model (GAMM) that included an approximate upper control limit. Results The data were overdispersed and displayed a downward trend and possible autocorrelation. The downward trend was followed by a predictable period after December 2003. The GLM-estimated incidence rate ratio was 0.98 (95% CI 0.98 to 0.99) per month. The GAMM-fitted count fell from 12.67 (95% CI 10.05 to 15.97) in January 2001 to 5.23 (95% CI 3.82 to 7.15) in December 2006 (p<0.001). The corresponding values for the GLM were 11.9 and 3.94. Residual plots suggested that the GLM underestimated the rate at the beginning and end of the series and overestimated it in the middle. The data suggested a more rapid rate fall before 2004 and a steady state thereafter, a pattern reflected in the GAMM chart. The approximate upper two-sigma equivalent control limit in the GLM and GAMM charts identified 2 months that showed possible special-cause variation. Conclusion Charts based on GAMM analysis are a suitable alternative to Shewhart U control charts with these data.
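The modified Shewhart U control chart used above monitors an event rate against limits derived from the pooled rate and each month's exposure. A minimal sketch of the standard U-chart calculation (the counts and exposures below are illustrative, not the hospital data):

```python
import numpy as np

def u_chart_limits(counts, exposures, sigma=3.0):
    """Shewhart U control chart for event rates with varying exposure.

    counts[i]    -- adverse events in month i (e.g. fall injuries)
    exposures[i] -- patient exposure in month i (e.g. occupied bed-days)
    Returns monthly rates, the centre line, and per-month upper/lower limits.
    """
    counts = np.asarray(counts, dtype=float)
    exposures = np.asarray(exposures, dtype=float)
    u = counts / exposures                      # monthly rates
    u_bar = counts.sum() / exposures.sum()      # centre line (pooled rate)
    se = np.sqrt(u_bar / exposures)             # Poisson standard error per month
    ucl = u_bar + sigma * se
    lcl = np.maximum(u_bar - sigma * se, 0.0)   # a rate cannot be negative
    return u, u_bar, ucl, lcl

# Flag months whose rate exceeds the upper control limit (toy data)
counts = [12, 9, 15, 7, 30, 8]
exposures = [1000, 980, 1050, 990, 1010, 1005]
u, u_bar, ucl, lcl = u_chart_limits(counts, exposures)
signals = [i for i in range(len(u)) if u[i] > ucl[i]]
```

The GLM- and GAMM-based charts in the study replace the constant centre line `u_bar` with a fitted trend, which is why they track the changing fall-injury rate better than the plain U chart.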
Abstract:
Total factor productivity plays an important role in the growth of the Indian economy. Using state-level data from 1993 to 2005 that were recently made available, we find widespread regional variation in productivity changes. In the years immediately following economic liberalization, productivity growth improved technical efficiency; however, in subsequent years, productivity growth was propelled by technological progress. We find a tendency toward convergence with regard to productivity growth among states; however, the states that were technically efficient when the economic reforms were instituted remained innovative in later years.
Abstract:
While economic theory acknowledges that some features of technology (e.g., indivisibilities, economies of scale and specialization) can fundamentally violate the traditional convexity assumption, almost all empirical studies accept the convexity property on faith. In this contribution, we apply two alternative flexible production technologies to measure total factor productivity growth and test the significance of the convexity axiom using a nonparametric test of closeness between unknown distributions. Based on unique field level data on the petroleum industry, the empirical results reveal significant differences, indicating that this production technology is most likely non-convex. Furthermore, we also show the impact of convexity on answers to traditional convergence questions in the productivity growth literature.
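A nonparametric test of closeness between unknown distributions asks whether two samples (here, productivity-growth estimates under convex versus non-convex technologies) could come from the same distribution. A minimal sketch using a permutation test on the Kolmogorov-Smirnov statistic, a simple stand-in for the kernel-density closeness test referenced in the abstract (all data below are illustrative, not the petroleum field data):

```python
import numpy as np

def permutation_test(x, y, n_perm=500, seed=0):
    """Two-sample permutation test of whether x and y share a distribution,
    using the Kolmogorov-Smirnov statistic as the distance measure.
    Returns the observed statistic and its permutation p-value."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)

    def ks_stat(a, b):
        # Max absolute gap between the two empirical CDFs on the pooled grid
        grid = np.sort(np.concatenate([a, b]))
        fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
        fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
        return np.abs(fa - fb).max()

    obs = ks_stat(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # reassign labels at random
        if ks_stat(pooled[:len(x)], pooled[len(x):]) >= obs:
            count += 1
    return obs, (count + 1) / (n_perm + 1)
```

A small p-value indicates the two sets of productivity estimates differ significantly, which is the kind of evidence the abstract uses against the convexity axiom.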
Abstract:
This paper presents our system to address the CogALex-IV 2014 shared task of identifying a single word most semantically related to a group of 5 words (queries). Our system uses an implementation of a neural language model and identifies the answer word by finding the word representation most semantically similar to the sum of the query representations. It is a fully unsupervised system that learns from around 20% of the UkWaC corpus. It correctly identifies 85 exact targets out of 2,000 queries, and 285 approximate targets within lists of 5 suggestions.
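The core retrieval step, summing the query vectors and taking the nearest non-query word by cosine similarity, can be sketched as follows. The vocabulary and vectors here are a toy illustration; the actual system learns its representations with a neural language model over UkWaC:

```python
import numpy as np

# Toy embedding table (illustrative values, not learned vectors)
vocab = {
    "dog":    np.array([0.90, 0.10, 0.00]),
    "cat":    np.array([0.80, 0.20, 0.10]),
    "puppy":  np.array([0.85, 0.15, 0.05]),
    "car":    np.array([0.00, 0.90, 0.40]),
    "animal": np.array([0.87, 0.12, 0.03]),
}

def most_related(queries, vocab):
    """Return the vocabulary word (excluding the queries themselves) whose
    vector is most cosine-similar to the sum of the query word vectors."""
    target = sum(vocab[q] for q in queries)
    target = target / np.linalg.norm(target)
    best, best_sim = None, -np.inf
    for word, vec in vocab.items():
        if word in queries:
            continue
        sim = float(vec @ target / np.linalg.norm(vec))
        if sim > best_sim:
            best, best_sim = word, sim
    return best
```

Ranking every word by this similarity and keeping the top 5 gives the suggestion lists scored in the shared task.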
Abstract:
Abstract Within the field of Information Systems, a good proportion of research is concerned with the work organisation and this has, to some extent, restricted the kind of application areas given consideration. Yet, it is clear that information and communication technology deployments beyond the work organisation are acquiring increased importance in our lives. With this in mind, we offer a field study of the appropriation of an online play space known as Habbo Hotel. Habbo Hotel, as a site of media convergence, incorporates social networking and digital gaming functionality. Our research highlights the ethical problems such a dual classification of technology may bring. We focus upon a particular set of activities undertaken within and facilitated by the space – scamming. Scammers dupe members with respect to their ‘Furni’, virtual objects that have online and offline economic value. Through our analysis we show that sometimes, online activities are bracketed off from those defined as offline and that this can be related to how the technology is classified by members – as a social networking site and/or a digital game. In turn, this may affect members’ beliefs about rights and wrongs. We conclude that given increasing media convergence, the way forward is to continue the project of educating people regarding the difficulties of determining rights and wrongs, and how rights and wrongs may be acted out with respect to new technologies of play online and offline.
Abstract:
In the past decade, policymakers in over 70 markets have introduced corporate governance codes or best practice guidelines. In East Asia, they have been introduced in Hong Kong in 1999 and 2006, Indonesia in 2000 and 2007, Malaysia in 2000 and 2007, the Philippines in 2002, Singapore in 2001 and 2005, South Korea in 2003, Taiwan in 2002 and Thailand in 2006. The common focus of these codes is to encourage but not force companies to improve their corporate governance practices to a specified target level, e.g., board independence of 30%. Another commonality is that the guidelines apply to all listed companies regardless of their ownership structure or other characteristics.
Abstract:
A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed-data likelihood, which is intractable for all but linear Gaussian models. To overcome this difficulty, we propose to estimate the likelihood unbiasedly, and to perform inference and make decisions via an exact-approximate algorithm. Two estimators are proposed: one using quasi-Monte Carlo methods and one using the Laplace approximation with importance sampling. Both approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
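The key idea, an unbiased estimate of an observed-data likelihood that has no closed form, can be illustrated with plain Monte Carlo on a Poisson random-intercept model. Averaging the conditional likelihood over draws of the random effect is unbiased for the intractable integral; it is a simple stand-in for the quasi-Monte Carlo and Laplace importance-sampling estimators in the abstract, and the model and values below are illustrative:

```python
import math
import numpy as np

def likelihood_estimate(y, beta, sigma_b, n_draws=4000, seed=0):
    """Unbiased Monte Carlo estimate of p(y | beta, sigma_b) for one block
    of counts under a Poisson random-intercept model:
        y_j | b ~ Poisson(exp(beta + b)),   b ~ N(0, sigma_b^2).
    The integral over b is replaced by an average of p(y | b_m) over
    draws b_m ~ N(0, sigma_b^2), which is unbiased by construction.
    """
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    const = -sum(math.lgamma(v + 1.0) for v in y)   # -log(prod_j y_j!)
    b = rng.normal(0.0, sigma_b, size=n_draws)      # random-effect draws
    log_mu = beta + b                               # log mean, one per draw
    # log p(y | b_m) = (sum_j y_j) * log(mu_m) - J * mu_m + const
    loglik = y.sum() * log_mu - y.size * np.exp(log_mu) + const
    # Average on the likelihood scale, stabilised with log-sum-exp
    m = loglik.max()
    return math.exp(m) * np.exp(loglik - m).mean()

p_hat = likelihood_estimate([3, 5, 4], beta=1.4, sigma_b=0.3)
```

In an exact-approximate scheme, this noisy but unbiased `p_hat` replaces the true likelihood inside the sequential Monte Carlo weights without biasing the resulting inference.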
Abstract:
This paper analyses recent corporate governance codes issued by 20 countries for evidence of convergence in corporate governance systems in Europe. The analysis shows that there has been a degree of convergence towards an Anglo-Saxon model of corporate governance as the audit committee concept is widely accepted in countries with both unitary and two-tier governance systems. Further, the latest audit committee recommendations in countries that have issued several governance codes show a strengthening of the recommendations for an audit committee over time in line with the Anglo-Saxon audit committee concept and convergence with the debate in the US and UK on issues such as the independence and financial expertise of members. However, consistent with the literature on the convergence of European corporate governance systems, at an operational level there is limited consistency in the recommended structure and role of audit committees.