999 results for extension publication


Relevance: 20.00%

Abstract:

Generic object-oriented programming languages combine parametric polymorphism and nominal subtype polymorphism, thereby providing better data abstraction, greater code reuse, and fewer run-time errors. However, most generic object-oriented languages provide a straightforward combination of the two kinds of polymorphism, which prevents the expression of advanced type relationships. Furthermore, most generic object-oriented languages have a type-erasure semantics: instantiations of type parameters are not available at run time, and thus may not be used by type-dependent operations. This dissertation shows that two features, which allow the expression of many advanced type relationships, can be added to a generic object-oriented programming language without type erasure: 1. type variables that are not parameters of the class that declares them, and 2. extension that is dependent on the satisfiability of one or more constraints. We refer to the first feature as hidden type variables and the second as conditional extension. Hidden type variables allow: covariance and contravariance without variance annotations or special type arguments such as wildcards; a single type to extend, and inherit methods from, infinitely many instantiations of another type; a limited capacity to augment the set of superclasses of a class after that class is defined; and the omission of redundant type arguments. Conditional extension allows the properties of a collection type to depend on the properties of its element type. This dissertation describes the semantics and implementation of hidden type variables and conditional extension. A sound type system is presented, together with a sound and terminating type-checking algorithm. Although designed for the Fortress programming language, hidden type variables and conditional extension can be incorporated into other generic object-oriented languages; many of the same problems would arise, and solutions analogous to those we present would apply.

Relevance: 20.00%

Abstract:

Over the last 30 years, Fuzzy Logic (FL) has emerged as a method that either complements or challenges stochastic methods, the traditional approach to modelling uncertainty. However, the circumstances under which FL rather than stochastic methods should be used remain disputed, because the areas of application of the two overlap and opinions differ as to when each method is appropriate. Practically relevant case studies comparing the two methods are lacking. This work compares stochastic and FL methods for the assessment of spare capacity, using pharmaceutical high purity water (HPW) utility systems as the example. The goal of this study was to find the most appropriate method for modelling uncertainty in industrial-scale HPW systems. The results provide evidence that stochastic methods are superior to FL methods for simulating uncertainty in chemical plant utilities, including HPW systems, in the typical case where extreme events (for example, peaks in demand) or day-to-day variation, rather than average values, are of interest. Average production output or other statistical measures may, for instance, be of interest in the assessment of workshops. The results further indicate that a stochastic model should be used only if a deterministic simulation shows it to be necessary. Consequently, this thesis concludes that either deterministic or stochastic methods should be used to simulate uncertainty in chemical plant utility systems, and by extension in some process systems, because extreme events and the modelling of day-to-day variation are important in capacity extension projects. Other reasons supporting the preference for stochastic HPW models over FL HPW models include:

1. The computer code for stochastic models is typically less complex than that of FL models, which reduces code maintenance and validation effort.

2. In many respects FL models are similar to deterministic models, so the need for a FL model over a deterministic model is questionable for industrial-scale HPW systems as presented here (and for other similar systems), since the latter requires simpler models.

3. A FL model may be difficult to "sell" to an end user, as its results represent "approximate reasoning", a definition of which is, however, lacking.

4. Stochastic models may be applied to other systems with relatively minor modifications, whereas FL models may not. For instance, the stochastic HPW model could be used to model municipal drinking water systems, whereas the FL HPW model could not, because the FL and stochastic model philosophies of an HPW system are fundamentally different. The stochastic model treats schedule and volume uncertainties as random phenomena described by statistical distributions based on estimated or historical data. The FL model, on the other hand, simulates schedule uncertainties based on estimated operator behaviour, e.g. the tiredness of the operators and their working schedule; in a municipal drinking water distribution system the notion of an "operator" breaks down.

5. Stochastic methods can account for uncertainties that are difficult to model with FL. The FL HPW system model does not account for dispensed-volume uncertainty, as there appears to be no reasonable way to represent it with FL, whereas the stochastic model includes volume uncertainty.
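
To make the stochastic side of this comparison concrete, the following is a minimal Monte Carlo sketch of a spare-capacity check for an HPW-like utility. It illustrates the general approach only: the distributions, the number of dispensing events, and the capacity figure are invented and are not taken from the thesis model.

```python
import numpy as np

# Minimal Monte Carlo sketch of a stochastic spare-capacity check for a
# high purity water (HPW) utility.  All figures below (event counts, volume
# distributions, plant capacity) are invented for illustration; they are
# not taken from the thesis.
rng = np.random.default_rng(42)

N_DAYS = 10_000          # simulated operating days
CAPACITY_L = 12_000.0    # hypothetical daily production capacity [L]

def simulate_day(rng):
    """Return total dispensed volume [L] for one simulated day."""
    # Schedule uncertainty: the number of dispensing events varies day to day.
    n_events = rng.poisson(lam=25)
    # Volume uncertainty: each event draws a normally distributed volume.
    volumes = rng.normal(loc=400.0, scale=80.0, size=n_events)
    return np.clip(volumes, 0.0, None).sum()

daily_demand = np.array([simulate_day(rng) for _ in range(N_DAYS)])

# Extreme events (peaks in demand), not the average, drive the spare-capacity
# question, which is why a purely deterministic average is not enough here.
print(f"mean daily demand : {daily_demand.mean():8.0f} L")
print(f"99th percentile   : {np.percentile(daily_demand, 99):8.0f} L")
print(f"P(demand > capacity) = {(daily_demand > CAPACITY_L).mean():.3%}")
```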

Relevance: 20.00%

Abstract:

Motivated by accurate average-case analysis, MOdular Quantitative Analysis (MOQA) has been developed at the Centre for Efficiency Oriented Languages (CEOL). In essence, MOQA allows the programmer to determine the average running time of a broad class of programs directly from the code in a (semi-)automated way. The MOQA approach has the property of randomness preservation: applying any operation to a random structure results in an output isomorphic to one or more random structures, which is the key to systematic timing. Based on original MOQA research, we discuss the design and implementation of a new domain-specific scripting language built on randomness-preserving operations and random structures. It is designed to facilitate compositional timing by systematically tracking the distributions of inputs and outputs. The labelled partial order (LPO) is the basic data type in the language. The programmer uses built-in MOQA operations together with restricted control-flow statements to design MOQA programs. The MOQA language is formally specified, both syntactically and semantically, in this thesis, and a practical language interpreter implementation is provided and discussed. By analysing new algorithms and data-restructuring operations, we demonstrate the wide applicability of the MOQA approach. We also extend MOQA theory to a number of other domains besides average-case analysis, showing strong connections between MOQA and parallel computing, reversible computing, and data entropy analysis.
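
MOQA derives average-case times compositionally and symbolically; as a point of reference for what "average-case running time determined from the code" means, the brute-force sketch below averages an exact comparison count over all permutations of a small input. The choice of algorithm (insertion sort) and the counting harness are illustrative and are not part of the MOQA language.

```python
from itertools import permutations

def insertion_sort_comparisons(seq):
    """Sort a copy of seq and return the number of key comparisons performed."""
    a = list(seq)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1                      # one key comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]   # shift the key left
                j -= 1
            else:
                break
    return comparisons

def exact_average_case(n):
    """Average comparison count over all n! permutations of 0..n-1."""
    perms = list(permutations(range(n)))
    return sum(insertion_sort_comparisons(p) for p in perms) / len(perms)

for n in range(2, 8):
    print(n, exact_average_case(n))
```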

Relevance: 20.00%

Abstract:

As the Internet has changed communication, commerce, and the distribution of information, it is also changing Information Systems Research (ISR). The goal of this paper is to bring the application and reliability of online research into the focus of ISR by examining the extent to which online research methods (ORM) have spread into its popular publication outlets. 513 articles from highly ranked ISR publication outlets from the last decade were analyzed using online content analysis. The findings show that online research methods are applied in ISR even though there has been no discussion of whether theories and methods that were defined offline remain valid in the new environment, or of the associated challenges.
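
As a rough illustration of how one coding step of such an online content analysis might be automated, the sketch below flags which online-research-method terms appear in a small stand-in corpus. The term list and the simple keyword matching are assumptions made for illustration; the paper's actual coding scheme is not described here.

```python
from collections import Counter

# Hypothetical coding step: flag which online research methods (ORM) an
# article mentions.  The term list and the keyword-matching approach are
# illustrative assumptions, not the paper's actual coding scheme.
ORM_TERMS = ["online survey", "web experiment", "log file analysis",
             "online content analysis", "web scraping"]

def code_article(text):
    """Return the set of ORM terms mentioned in one article text."""
    lowered = text.lower()
    return {term for term in ORM_TERMS if term in lowered}

# Stand-in corpus; in the study this would be the 513 ISR articles.
articles = [
    "We conducted an online survey of 300 ERP users ...",
    "A web experiment was used to test interface variants ...",
    "The study combines log file analysis with an online survey ...",
]

counts = Counter(term for text in articles for term in code_article(text))
print(counts.most_common())
```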

Relevance: 20.00%

Abstract:

BACKGROUND: Anterior cruciate ligament (ACL) reconstruction is associated with a high incidence of second tears (graft tears and contralateral ACL tears). These secondary tears have been attributed to asymmetrical lower extremity mechanics. Knee bracing is one potential intervention during rehabilitation with the potential to normalize lower extremity asymmetry; however, little is known about the effect of bracing on movement asymmetry in patients following ACL reconstruction. HYPOTHESIS: Wearing a knee brace would increase knee joint flexion, and joint mechanics would become more symmetrical in the braced condition. OBJECTIVE: To examine how knee bracing affects knee joint function and symmetry over the course of rehabilitation in patients 6 months following ACL reconstruction. STUDY DESIGN: Controlled laboratory study. LEVEL OF EVIDENCE: Level 3. METHODS: Twenty-three adolescent patients rehabilitating from ACL reconstruction surgery were recruited. All subjects underwent a motion analysis assessment during a stop-jump activity, performed 6 months after ACL reconstruction, with and without a functional knee brace on the surgical side that resisted extension. Statistical analysis used a 2 × 2 (limb × brace) analysis of variance with a significance level of 0.05. RESULTS: Subjects had increased knee flexion on the surgical side when braced. The brace condition increased knee flexion velocity, decreased the initial knee flexion angle, and increased the ground reaction force and knee extension moment on both limbs. Side-to-side asymmetry was present across conditions for the vertical ground reaction force and knee extension moment. CONCLUSION: Wearing a knee brace appears to increase lower extremity compliance and promotes normalized loading on the surgical side. CLINICAL RELEVANCE: Knee extension constraint bracing in postoperative ACL patients may improve symmetry of lower extremity mechanics, which is potentially beneficial in progressing rehabilitation and reducing the incidence of second ACL tears.
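
For readers unfamiliar with the 2 × 2 (limb × brace) design, the sketch below shows one way such a repeated-measures analysis could be set up in Python with statsmodels. The column names, the long-format layout, and the numbers are invented; this is not the study's dataset or analysis software.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format table: one row per subject x limb x brace condition,
# with peak knee flexion (degrees) as the dependent variable.
data = pd.DataFrame({
    "subject": [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3],
    "limb":    ["surgical", "surgical", "nonsurgical", "nonsurgical"] * 3,
    "brace":   ["braced", "unbraced"] * 6,
    "knee_flexion": [62.0, 55.1, 64.3, 63.8,
                     58.7, 51.9, 61.2, 60.5,
                     65.4, 57.3, 66.0, 65.1],
})

# 2 x 2 repeated-measures ANOVA with within-subject factors limb and brace.
result = AnovaRM(data, depvar="knee_flexion", subject="subject",
                 within=["limb", "brace"]).fit()
print(result)   # F statistics and p values; compare p against alpha = 0.05
```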

Relevance: 20.00%

Abstract:

BACKGROUND: When the nature and direction of research results affect their chances of publication, the evidence base becomes distorted; this distortion is termed publication bias. Despite considerable recent efforts to implement measures to reduce the non-publication of trials, publication bias is still a major problem in medical research. The objective of our study was to identify barriers to and facilitators of interventions to prevent or reduce publication bias. METHODS: We systematically reviewed the scholarly literature and extracted data from the articles. In addition, we performed semi-structured interviews with stakeholders. We carried out an inductive thematic analysis to identify barriers to and facilitators of interventions to counter publication bias. RESULTS: The systematic review identified 39 articles. Thirty-four of 89 invited interview partners agreed to be interviewed. We clustered interventions into four categories: prospective trial registration, incentives for reporting in peer-reviewed journals or research reports, public availability of individual patient-level data, and peer-review/editorial processes. Barriers identified included economic and personal interests, lack of financial resources for a global comprehensive trial registry, and differing legal systems. Facilitators identified included raising awareness of the effects of publication bias, providing incentives to make data publicly available, and implementing laws to enforce prospective registration and reporting of clinical trial results. CONCLUSIONS: Publication bias is a complex problem that reflects the complex system in which it occurs. Cooperation among stakeholders to increase public awareness of the problem, better tailoring of incentives to publish, and ultimately legislative regulation have the greatest potential for reducing publication bias.

Relevance: 20.00%

Abstract:

The experiments in the Cole and Moore article in the first issue of the Biophysical Journal provided the first independent experimental confirmation of the Hodgkin-Huxley (HH) equations. A log-log plot of the K current versus time showed that raising the HH variable n to the sixth power provided the best fit to the data. Subsequent simulations using n^6, with the resting potential set to the in vivo value, simplify the HH equations by eliminating the leakage term. Our article also reported that the K current in response to a depolarizing step to E_Na was delayed if the step was preceded by a hyperpolarization. While the interpretation of this phenomenon in the article was flawed, subsequent simulations show that the effect arises entirely from the original HH equations.
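
For reference, the potassium branch of the HH model under discussion can be written with the gating exponent left as a parameter x, so that both the classical x = 4 and the better-fitting x = 6 reported here are covered (standard HH notation, not a quotation from the article):

```latex
I_K = \bar{g}_K \, n^{x} \,(V - E_K), \qquad
\frac{dn}{dt} = \alpha_n(V)\,(1 - n) - \beta_n(V)\, n, \qquad
x = 4 \ \text{(original HH)}, \quad x = 6 \ \text{(Cole and Moore fit)}.
```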

Relevance: 20.00%

Abstract:

OBJECTIVE: We tested the hypothesis that intraventricular hemorrhage (IVH) is associated with incontinence and gait disturbance among survivors of intracerebral hemorrhage (ICH) at 3-month follow-up. METHODS: The Genetic and Environmental Risk Factors for Hemorrhagic Stroke study was used as the discovery set. The Ethnic/Racial Variations of Intracerebral Hemorrhage study served as a replication set. Both studies performed prospective hot-pursuit recruitment of ICH cases with 3-month follow-up. Multivariable logistic regression analyses were performed to identify risk factors for incontinence and gait dysmobility at 3 months after ICH. RESULTS: The study population consisted of 307 ICH cases in the discovery set and 1,374 cases in the replication set. In the discovery set, we found that increasing IVH volume was associated with incontinence (odds ratio [OR] 1.50; 95% confidence interval [CI] 1.10-2.06) and dysmobility (OR 1.58; 95% CI 1.17-2.15) after controlling for ICH location, initial ICH volume, age, baseline modified Rankin Scale score, sex, and admission Glasgow Coma Scale score. In the replication set, increasing IVH volume was also associated with both incontinence (OR 1.42; 95% CI 1.27-1.60) and dysmobility (OR 1.40; 95% CI 1.24-1.57) after controlling for the same variables. CONCLUSION: ICH subjects with IVH extension are at increased risk of developing incontinence and dysmobility after controlling for factors associated with severity and disability. This finding suggests a potential target to prevent or treat long-term disability after ICH with IVH.
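
The multivariable model described above is a standard logistic regression in which each adjusted odds ratio is the exponential of a fitted coefficient. The sketch below illustrates that relationship on simulated data; the variable names echo some covariates in the abstract, but the data, effect sizes, and reduced covariate set are invented and do not reproduce the study's analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data set (one row per ICH case); the values are
# simulated, not study data.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "ivh_volume": rng.gamma(shape=2.0, scale=5.0, size=n),   # mL
    "ich_volume": rng.gamma(shape=2.0, scale=10.0, size=n),  # mL
    "age": rng.normal(65, 12, size=n),
    "admission_gcs": rng.integers(3, 16, size=n),
})
# Simulated outcome: higher IVH volume raises the odds of incontinence.
logit = -2.0 + 0.08 * df["ivh_volume"] + 0.01 * (df["age"] - 65)
df["incontinence_3mo"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["ivh_volume", "ich_volume", "age", "admission_gcs"]])
fit = sm.Logit(df["incontinence_3mo"], X).fit(disp=False)

# Adjusted odds ratios: exp(coefficient), with 95% confidence intervals.
ci = fit.conf_int()
print(pd.DataFrame({"OR": np.exp(fit.params),
                    "CI_low": np.exp(ci[0]), "CI_high": np.exp(ci[1])}))
```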

Relevance: 20.00%

Abstract:

A new general cell-centered solution procedure based upon the conventional control volume or finite volume (CV or FV) approach has been developed for numerical heat transfer and fluid flow which encompasses both structured and unstructured meshes for any kind of mixed polygonal cell. Unlike conventional FV methods for structured and block-structured meshes, and FV and FE methods for unstructured meshes, the irregular control volume (ICV) method does not require the shape of the element or cell to be predefined, because it simply exploits the concept of fluxes across cell faces. That is, the ICV method enables meshes employing mixtures of triangular, quadrilateral, and any other higher-order polygonal cells to be exploited with a single solution procedure. The ICV approach otherwise preserves all the desirable features of conventional FV procedures for a structured mesh; in the current implementation, collocation of variables at cell centers is used with Rhie and Chow interpolation (to suppress pressure oscillation in the flow field) in the context of the SIMPLE pressure-correction solution procedure. In fact, all other structured-mesh FV methods may be perceived as a subset of the ICV formulation. The new ICV formulation is benchmarked using two standard computational fluid dynamics (CFD) problems, i.e., the moving-lid cavity and the natural-convection-driven cavity. Both cases were solved with a variety of structured and unstructured meshes, the latter exploiting mixed polygonal-cell meshes. The polygonal-mesh experiments show a higher degree of accuracy than equivalent meshes (in terms of nodal density) of triangular or quadrilateral cells; these results may be interpreted in a manner similar to the CUPID scheme used in structured meshes for reducing numerical diffusion in flows with changing direction.
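
The property the ICV method relies on, that a cell-centered balance needs only face fluxes and never the shape of the cell, can be shown with a minimal sketch. The diffusion-only flux, the hand-built mesh, and the data structures below are invented for illustration and are far simpler than the full SIMPLE solution procedure described above.

```python
import numpy as np

# Minimal cell-centered finite-volume residual for pure diffusion on an
# arbitrary polygonal mesh.  Each interior face only needs its area, the
# distance between the two adjacent cell centres, and the two cell indices;
# the number of faces per cell (triangle, quad, any polygon) never appears.

def diffusion_residual(phi, faces, volumes, gamma=1.0):
    """Return the per-cell sum of face fluxes divided by the cell volume.

    phi     : (n_cells,) cell-centred scalar field
    faces   : list of (owner, neighbour, area, dist) tuples for interior faces
    volumes : (n_cells,) cell volumes
    """
    residual = np.zeros_like(phi)
    for owner, neigh, area, dist in faces:
        # Two-point flux approximation across the face.
        flux = gamma * area * (phi[neigh] - phi[owner]) / dist
        residual[owner] += flux      # flux into the owner cell
        residual[neigh] -= flux      # equal and opposite for the neighbour
    return residual / volumes

# Tiny hand-built mesh: cell 0 could be a triangle, cell 1 a quadrilateral,
# cell 2 a pentagon -- the routine neither knows nor cares.
phi = np.array([0.0, 1.0, 4.0])
faces = [(0, 1, 1.0, 0.5), (1, 2, 1.2, 0.8)]
volumes = np.array([0.4, 0.6, 0.9])
print(diffusion_residual(phi, faces, volumes))
```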

Relevance: 20.00%

Abstract:

Purcell's classical vector method for the construction of solutions to dense systems of linear equations is extended to a flexible orthogonalisation procedure. Some properties of the orthogonalisation procedure are revealed in relation to the classical Gauss-Jordan elimination with or without pivoting, and additional properties that are not shared by classical Gauss-Jordan elimination are exploited. Further properties related to distributed computing are discussed, with applications to panel element equations in subsonic compressible aerodynamics. Using an orthogonalisation procedure within panel methods enables a functional decomposition of the sequential panel methods and leads to a two-level parallelism.
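
As a point of reference for the comparison above, classical Gauss-Jordan elimination with partial pivoting can be written as a short routine. This is only the baseline being compared against; the Purcell-style orthogonalisation procedure itself is not reproduced here.

```python
import numpy as np

def gauss_jordan_solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting.

    This is the classical dense-system baseline referred to above; the
    orthogonalisation-based procedure is not reproduced here.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, b.reshape(n, 1)])        # augmented matrix [A | b]
    for col in range(n):
        # Partial pivoting: bring the largest entry in this column to the pivot row.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                  # normalise the pivot row
        for row in range(n):                   # eliminate the column everywhere else
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, -1]

A = [[4.0, 1.0], [2.0, 3.0]]
b = [1.0, 2.0]
print(gauss_jordan_solve(A, b), np.linalg.solve(A, b))   # the two should agree
```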

Relevance: 20.00%

Abstract:

The growth of computer power allows the solution of complex problems related to compressible flow, an important class of problems in modern-day CFD, and over the last 15 years or so many review works on CFD have been published. This book concerns both mathematical and numerical methods for compressible flow. In particular, it provides a clear-cut introduction as well as an in-depth treatment of modern numerical methods in CFD. The book is organised in two parts. The first part consists of Chapters 1 and 2 and is mainly devoted to theoretical discussions and results. Chapter 1 concerns fundamental physical concepts and theoretical results in gas dynamics. Chapter 2 describes the basic mathematical theory of compressible flow using the inviscid Euler equations and the viscous Navier–Stokes equations; existence and uniqueness results are also included. The second part covers modern numerical methods for the Euler and Navier–Stokes equations. Chapter 3 is devoted entirely to the finite volume method for the numerical solution of the Euler equations and covers fundamental concepts such as the order of numerical schemes, stability, and high-order schemes. The finite volume method is illustrated for the 1-D as well as the multidimensional Euler equations. Chapter 4 covers the theory of the finite element method and its application to compressible flow. A section is devoted to the combined finite volume–finite element method, and its background theory is also included. Throughout the book, numerous examples are included to demonstrate the numerical methods. The book provides good insight into numerical schemes, theoretical analysis, and the validation of test problems. It is a very useful reference for applied mathematicians, numerical analysts, and practising engineers, and an important reference for postgraduate researchers in the field of scientific computing and CFD.
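
For orientation, the inviscid Euler equations targeted by the finite volume method of Chapter 3 can be written in one-dimensional conservation form as follows (standard notation for an ideal gas; background, not material quoted from the book):

```latex
\frac{\partial \mathbf{U}}{\partial t} + \frac{\partial \mathbf{F}(\mathbf{U})}{\partial x} = \mathbf{0},
\qquad
\mathbf{U} = \begin{pmatrix} \rho \\ \rho u \\ E \end{pmatrix},
\qquad
\mathbf{F}(\mathbf{U}) = \begin{pmatrix} \rho u \\ \rho u^2 + p \\ u\,(E + p) \end{pmatrix},
\qquad
p = (\gamma - 1)\left(E - \tfrac{1}{2}\rho u^2\right).
```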

Relevance: 20.00%

Abstract:

Since the late 1970s the western academy has encouraged the development of postcolonial literary theory and the formulation of a postcolonial literary canon existing outside the prescriptive narratives of the ‘mother’ country and empire. Having lost faith in the binary oppositions underpinning such narratives, we turned to alternative fictions that contested the construction of the ‘other’, the world divided between the ‘West and the Rest’. The publication of Edward Said’s Orientalism in 1978 marked the beginning of the discipline now known as postcolonial studies, with its new ways of understanding ‘the west’s’ relationship with ‘the east’ and, by extension, all the former colonies of empire. Despite these radical origins, however, postcolonialism’s more recent emphasis on the psychological, and its affirmation of the hybrid text and self, has for many served to obscure real economic and social realities that have very little to do with the magical or wondrous textual expression of a postcolonial identity. This paper considers problems associated with defining the postcolonial and proposes that, in a literary context, we broaden its meaning to include texts traditionally outside the category of postcolonial literature. To extend the meaning of the postcolonial is timely, as we are now witnessing its relocation from ‘margin’ to ‘centre’ with the election of Barack Obama. This moment may be seen as a disruption of conventional understandings of what constitutes postcolonial literature, essentially as an oppositional discourse that could only define itself as peripheral to, or ‘post’, metropolitan and economic concerns. [From the Author]