823 results for whether court has power to extend time


Relevance:

100.00%

Publisher:

Abstract:

The aim of a phase II single-arm clinical trial of a new drug is to determine whether it shows sufficiently promising activity to warrant further development. Over the last several years, Bayesian statistical methods have been proposed and used for this purpose. Bayesian approaches are well suited to earlier-phase trials because they take into account information that accrues during the trial: predictive probabilities are updated and so become more accurate as the trial progresses. Suitable priors can act as pseudo-samples, which makes small-sample clinical trials more informative and gives patients a better chance of receiving effective treatments. The goal of this paper is to provide a tutorial for statisticians using Bayesian methods for the first time and for investigators with some statistical background. In addition, real data from three clinical trials are presented as examples to illustrate how to conduct a Bayesian analysis of phase II single-arm clinical trials with binary outcomes.
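
As a rough illustration of the kind of computation such a tutorial covers, the sketch below (with a hypothetical prior, trial size and response threshold, not taken from the paper's examples) updates a Beta prior with binary responses and reports the posterior probability that the response rate exceeds an uninteresting level.

```python
# Illustrative sketch (not the paper's method or data): Beta-Binomial updating
# for a single-arm phase II trial with a binary response endpoint.
from scipy.stats import beta

# Hypothetical inputs: a Beta(1, 1) prior, 7 responses in 20 evaluable patients,
# and a "not promising" response rate of 20%.
a_prior, b_prior = 1.0, 1.0
responses, n = 7, 20
p0 = 0.20

# Posterior is Beta(a + responses, b + non-responses).
a_post = a_prior + responses
b_post = b_prior + (n - responses)

# Posterior probability that the true response rate exceeds p0.
prob_promising = 1.0 - beta.cdf(p0, a_post, b_post)

# Posterior predictive probability that the next patient responds.
pred_next = a_post / (a_post + b_post)

print(f"P(p > {p0}) = {prob_promising:.3f}, P(next response) = {pred_next:.3f}")
```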

Relevance:

100.00%

Publisher:

Abstract:

The eukaryotic genome is a mosaic of eubacterial and archaeal genes in addition to those unique to itself. The mosaic may have arisen as the result of two prokaryotes merging their genomes, or from genes acquired from an endosymbiont of eubacterial origin. A third possibility is that the eukaryotic genome arose from successive events of lateral gene transfer over long periods of time. This theory does not exclude the endosymbiont, but questions whether it is necessary to explain the peculiar set of eukaryotic genes. We use phylogenetic studies and reconstructions of ancestral first appearances of genes on the prokaryotic phylogeny to assess evidence for the lateral gene transfer scenario. We find that phylogenies advanced to support fusion can also arise from a succession of lateral gene transfer events. Our reconstructions of ancestral first appearances of genes reveal that the various genes that make up the eukaryotic mosaic arose at different times and in diverse lineages on the prokaryotic tree, and were not available in a single lineage. Successive events of lateral gene transfer can explain the unusual mosaic structure of the eukaryotic genome, with its content linked to the immediate adaptive value of the genes it acquired. Progress in understanding eukaryotes may come from identifying ancestral features, such as the eukaryotic spliceosome, that could explain why this lineage invaded, or created, the eukaryotic niche.
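
The abstract does not spell out the reconstruction algorithm; as a toy illustration only, the sketch below places a gene's first appearance at the most recent common ancestor of the lineages that carry it, i.e. a single-gain (Dollo-like) assumption applied to a hypothetical tree and gene set.

```python
# Toy sketch (hypothetical tree and gene data, not the authors' analysis):
# under a single-gain assumption, a gene's first appearance is placed at the
# most recent common ancestor (MRCA) of the lineages that carry it.

# Parent links for a small rooted tree: root -> A, B; A -> sp1, sp2; B -> sp3, sp4.
parent = {"A": "root", "B": "root",
          "sp1": "A", "sp2": "A", "sp3": "B", "sp4": "B"}

def path_to_root(node):
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path

def mrca(taxa):
    paths = [path_to_root(t) for t in taxa]
    common = set(paths[0]).intersection(*map(set, paths[1:]))
    # The first common node walking tip -> root is the MRCA.
    return next(n for n in paths[0] if n in common)

genes = {"geneX": ["sp1", "sp2"],          # first appears within clade A
         "geneY": ["sp1", "sp3", "sp4"]}   # first appears at the root

for gene, carriers in genes.items():
    print(gene, "first appearance:", mrca(carriers))
```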

Relevance:

100.00%

Publisher:

Abstract:

A large number of processes are involved in the pathogenesis of atherosclerosis but it is unclear which of them play a rate-limiting role. One way of resolving this problem is to investigate the highly non-uniform distribution of disease within the arterial system; critical steps in lesion development should be revealed by identifying arterial properties that differ between susceptible and protected sites. Although the localisation of atherosclerotic lesions has been investigated intensively over much of the 20th century, this review argues that the factor determining the distribution of human disease has only recently been identified. Recognition that the distribution changes with age has, for the first time, allowed it to be explained by variation in transport properties of the arterial wall; hitherto, this view could only be applied to experimental atherosclerosis in animals. The newly discovered transport variations which appear to play a critical role in the development of adult disease have underlying mechanisms that differ from those elucidated for the transport variations relevant to experimental atherosclerosis: they depend on endogenous NO synthesis and on blood flow. Manipulation of transport properties might have therapeutic potential. Copyright (C) 2004 S. Karger AG, Basel.

Relevance:

100.00%

Publisher:

Abstract:

Background: The gut and immune system form a complex integrated structure that has evolved to provide effective digestion and defence against ingested toxins and pathogenic bacteria. However, great variation exists in what is considered normal healthy gut and immune function. Thus, whilst it is possible to measure many aspects of digestion and immunity, it is more difficult to interpret the benefits to individuals of variation within what is considered to be a normal range. Nevertheless, it is important to set standards for optimal function for use by consumers, industry and those concerned with public health. The digestive tract is most frequently the object of functional and health claims, and a large market already exists for gut-functional foods worldwide. Aim: To define normal function of the gut and immune system and describe available methods of measuring it. Results: We have defined normal bowel habit and transit time, identified their role as risk factors for disease and described how they may be measured. Similarly, we have tried to define what constitutes a healthy gut flora in terms of the dominant genera and their metabolism, and listed the many, varied and novel methods for determining these parameters. It has proved less easy to provide boundaries for what constitutes optimal or improved gastric emptying, gut motility, nutrient and water absorption, and the function of organs such as the liver, gallbladder and pancreas. The many tests of these functions are described. We have also discussed gastrointestinal well-being. Sensations arising from the gut can be both pleasant and unpleasant; however, the characteristics of well-being are ill defined and merge imperceptibly from acceptable to unacceptable, a state that is subjective. Nevertheless, we feel this is an important area for future work and method development. The immune system is even more difficult to make quantitative judgements about. When it is defective, clinical problems ensue, but this is an uncommon state. The innate and adaptive immune systems work synergistically and comprise many cellular and humoral factors. The adaptive system is extremely sophisticated, and between the two arms of immunity there is great redundancy, which provides robust defences. New aspects of immune function are discovered regularly. It is not clear whether immune function can be "improved". Measuring aspects of immune function is possible, but no single test will define either the status or the functional capacity of the immune system. Human studies are often limited by the ability to sample only blood or secretions such as saliva, and it should be remembered that only 2% of lymphocytes circulate at any given time, which limits the interpretation of data. We recommend assessing the functional capacity of the immune system by: measuring specific cell functions ex vivo; measuring in vivo responses to challenge, e.g. the change in antibody in blood or the response to antigens; and determining the incidence and severity of infection in target populations during naturally occurring episodes or in response to attenuated pathogens.

Relevance:

100.00%

Publisher:

Abstract:

Ever since the invention of writing, people have used text to store and distribute their thoughts. With the advent of computers and the Internet, the delivery of these messages has become almost instant, and textual conversations can now be held regardless of location or distance. Advances in computational power for 3D graphics are enabling Virtual Environments (VEs) within which users can become increasingly immersed. By opening these environments to other users, initially by sharing such text conversation channels, we aim to extend the immersive experience into an online virtual community. This paper examines work that brings textual communications into the VE, enabling interaction between the real and virtual worlds.
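
The paper's own architecture is not described in the abstract; the following minimal, entirely hypothetical sketch only illustrates the general idea of relaying text-chat lines to every client connected to a shared virtual environment.

```python
# Minimal hypothetical sketch (not the system described in the paper): a relay
# that broadcasts text-chat lines to every connected virtual-environment client.
import asyncio

clients = set()

async def handle(reader, writer):
    clients.add(writer)
    try:
        while line := await reader.readline():
            # Forward each chat line to all other connected VE clients.
            for w in clients:
                if w is not writer:
                    w.write(line)
                    await w.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 8888)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```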

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a real-time multi-camera surveillance system that can be applied to a range of application domains. This integrated system is designed to observe crowded scenes and has mechanisms to improve tracking of objects that are in close proximity. The four component modules described in this paper are (i) motion detection using a layered background model, (ii) object tracking based on local appearance, (iii) hierarchical object recognition, and (iv) fused multisensor object tracking using multiple features and geometric constraints. This integrated approach to complex scene tracking is validated against a number of representative real-world scenarios to show that robust, real-time analysis can be performed. Copyright (C) 2007 Hindawi Publishing Corporation. All rights reserved.
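
The abstract does not give the details of the layered background model; as a much simpler stand-in for the motion-detection stage, the sketch below maintains a single running-average background and thresholds the per-pixel difference (all parameter values are hypothetical).

```python
# Simplified stand-in for the motion-detection stage (the paper's layered
# background model is richer): a single running-average background with
# per-pixel thresholding. All parameter values are hypothetical.
import numpy as np

def update_background(background, frame, alpha=0.02):
    """Exponential running average of greyscale frames."""
    return (1.0 - alpha) * background + alpha * frame

def foreground_mask(background, frame, threshold=25.0):
    """Flag pixels that differ strongly from the current background model."""
    return np.abs(frame - background) > threshold

# Toy usage: random frames stand in for a camera feed of a mostly static scene.
rng = np.random.default_rng(0)
background = rng.uniform(0, 255, size=(240, 320))
for _ in range(10):
    frame = background + rng.normal(0.0, 5.0, size=background.shape)
    mask = foreground_mask(background, frame)
    background = update_background(background, frame)
print("foreground pixels in last frame:", int(mask.sum()))
```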

Relevance:

100.00%

Publisher:

Abstract:

The climate belongs to the class of non-equilibrium, forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyse the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space, averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has well-recognized prototypical value: it is a spatially extended one-dimensional model and presents basic ingredients of the actual atmosphere, such as dissipation, advection and the presence of an external forcing. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyse the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as their asymptotic behaviour, the validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions, as well as the integral constraints, can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow these properties to be expressed as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions against the outputs of the simulations to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations with general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology for studying general climate change problems on virtually any time scale by resorting only to well-selected simulations and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problems of climate sensitivity, climate prediction and climate change from a radically new perspective.
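
For readers unfamiliar with the test bed, the classical Lorenz 96 model evolves as dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F with cyclic indices; the sketch below integrates it with a basic Runge-Kutta scheme (the forcing and resolution are illustrative choices, not the settings used in the paper).

```python
# Minimal sketch of the classical Lorenz 96 model used as the paper's test bed:
#   dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F  (cyclic indices),
# integrated with a basic fourth-order Runge-Kutta step. F, the number of
# sites and the time step are illustrative, not the paper's settings.
import numpy as np

def lorenz96_tendency(x, forcing):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing):
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

n_sites, forcing, dt = 40, 8.0, 0.01
x = forcing * np.ones(n_sites)
x[0] += 0.01                      # small perturbation to trigger chaos
for _ in range(5000):
    x = rk4_step(x, dt, forcing)
print("mean energy per site:", float(np.mean(0.5 * x**2)))
```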

Relevance:

100.00%

Publisher:

Abstract:

In the UK, participation in higher education has risen over the past two decades, along with a shift of the costs of higher education onto the individual and a move to widening participation among previously under-represented groups. This has led to changes in the way individuals fund their higher education, in particular a rise in the incidence of term-time employment. Term-time employment potentially plays a much bigger role than in the past, both as a means for individuals to fund their education and reduce debt, and as a way to gain valuable work experience and increase employability. With the increase in the number of graduates in the UK labour market, it is now more important for individuals to be able to differentiate themselves.

Relevance:

100.00%

Publisher:

Abstract:

The thoughtful construction of molecular switches has led to a gamut of supramolecular systems that can be used in molecular electronics. These include molecules based on thienylethenes, spiropyrans, fulgides, dithienylphenanthrolines, and diazafluorenes. This article reviews the recent developments made in the synthesis and characterization of all these systems, thereby allowing a comparative study to validate the viability of these switchable molecules on the nanoscale. The drawbacks of each class are also demonstrated and, at the same time, remedies for further improvement are prescribed. We have made an honest attempt to present an exhaustive account of all the different photochromic switches developed by us hitherto.

Relevance:

100.00%

Publisher:

Abstract:

This letter argues that the current controversy about whether W_buoyancy, the power input due to the surface buoyancy fluxes, is large or small in the oceans stems from two distinct and incompatible views on how W_buoyancy relates to the volume-integrated work of expansion/contraction B. The currently prevailing view is that W_buoyancy should be identified with the net value of B, which current theories estimate to be small. The alternative view, defended here, is that only the positive part of B, i.e. the part converting internal energy into mechanical energy, should enter the definition of W_buoyancy, since the negative part of B is associated with the non-viscous dissipation of mechanical energy. Two indirect methods suggest that, by contrast, the positive part of B is potentially large.
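
In symbols (the notation for the local work density is assumed here for illustration, not taken from the letter), the two definitions differ as follows.

```latex
% Assumed notation: b(\mathbf{x},t) is the local rate of work of
% expansion/contraction and B = \int_V b \,\mathrm{d}V its volume integral,
% split into positive and negative parts.
\begin{equation}
  B^{+} = \int_{V} \max\{b, 0\}\,\mathrm{d}V, \qquad
  B^{-} = \int_{V} \max\{-b, 0\}\,\mathrm{d}V, \qquad
  B = B^{+} - B^{-}.
\end{equation}
% Prevailing view:           W_{\mathrm{buoyancy}} = B      (estimated small).
% View defended in the letter: W_{\mathrm{buoyancy}} = B^{+} (potentially large),
% with B^{-} associated with non-viscous dissipation of mechanical energy.
```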

Relevance:

100.00%

Publisher:

Abstract:

I was born human. But this was an accident of fate - a condition merely of time and place. I believe it's something we have the power to change. I will tell you why. In August 1998, a silicon chip was implanted in my arm, allowing a computer to monitor me as I moved through the halls and offices of the Department of Cybernetics at the University of Reading, just west of London, where I've been a professor since 1988. My implant communicated via radio waves with a network of antennas throughout the department that in turn transmitted the signals to a computer programmed to respond to my actions. At the main entrance, a voice box operated by the computer said "Hello" when I entered; the computer detected my progress through the building, opening the door to my lab for me as I approached it and switching on the lights. For the nine days the implant was in place, I performed seemingly magical acts simply by walking in a particular direction. The aim of this experiment was to determine whether information could be transmitted to and from an implant. Not only did we succeed, but the trial demonstrated how the principles behind cybernetics could perform in real-life applications.

Relevance:

100.00%

Publisher:

Abstract:

The scaling of metabolic rates to body size is widely considered to be of great biological and ecological importance, and much attention has been devoted to determining its theoretical and empirical value. Most debate centers on whether the underlying power law describing metabolic rates has an exponent of 2/3 (as predicted by the scaling of surface area/volume relationships) or 3/4 ("Kleiber's law"). Although recent evidence suggests that empirically derived exponents vary among clades with radically different metabolic strategies, such as ectotherms and endotherms, models such as the metabolic theory of ecology depend on the assumption that there is at least a predominant, if not universal, metabolic scaling exponent. Most analyses claimed to support the predictions of general models have, however, failed to control for phylogeny. We used phylogenetic generalized least-squares models to estimate allometric slopes for both basal metabolic rate (BMR) and field metabolic rate (FMR) in mammals. Metabolic rate scaling conformed to no single theoretical prediction, but varied significantly among phylogenetic lineages. In some lineages we found a 3/4 exponent, in others a 2/3 exponent, and in yet others exponents that differed significantly from both theoretical values. Analysis of the phylogenetic signal in the data indicated that the assumptions of neither species-level analysis nor independent contrasts were met. Analyses that assumed no phylogenetic signal in the data (species-level analysis) or a strong phylogenetic signal (independent contrasts) therefore returned estimates of allometric slopes that were erroneous in 30% and 50% of cases, respectively. Hence, quantitative estimation of the phylogenetic signal is essential for determining scaling exponents. The lack of evidence for a predominant scaling exponent in these analyses suggests that general models of metabolic scaling, and the macro-ecological theories that depend on them, have little explanatory power.
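
As a minimal illustration of the PGLS estimator used in such analyses (with a toy covariance matrix and toy data, not the paper's phylogeny or measurements), the allometric slope of log metabolic rate on log body mass is obtained by generalized least squares:

```python
# Minimal sketch of a phylogenetic generalized least-squares (PGLS) slope
# estimate: beta_hat = (X' V^{-1} X)^{-1} X' V^{-1} y, where V is the
# covariance matrix implied by the phylogeny. The tree, covariance and data
# below are toy values, not the paper's dataset.
import numpy as np

# Hypothetical Brownian-motion covariance for 4 species (two clades).
V = np.array([[1.0, 0.7, 0.2, 0.2],
              [0.7, 1.0, 0.2, 0.2],
              [0.2, 0.2, 1.0, 0.6],
              [0.2, 0.2, 0.6, 1.0]])

log_mass = np.array([1.0, 1.5, 2.5, 3.0])    # log10 body mass (toy)
log_bmr  = np.array([0.8, 1.15, 1.9, 2.3])   # log10 basal metabolic rate (toy)

X = np.column_stack([np.ones_like(log_mass), log_mass])
Vinv = np.linalg.inv(V)
beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ log_bmr)
print(f"PGLS intercept = {beta[0]:.3f}, allometric slope = {beta[1]:.3f}")
```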

Relevance:

100.00%

Publisher:

Abstract:

The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, Prony's method is applied to time-domain waveform data modelling in the presence of noise. Three problems encountered in this work are studied: (1) determination of the order of the waveform; (2) determination of the number of multiple roots; (3) determination of the residues. Methods for solving these problems are given and simulated on a computer. Finally, an output pulse of a model PG-10N signal generator, and the distorted waveform obtained by transmitting that pulse through a piece of coaxial cable, are modelled, and satisfactory results are obtained. The effectiveness of Prony's method for waveform data modelling in the presence of noise is thus confirmed.
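
As a minimal illustration of the basic least-squares form of Prony's method (the paper's procedures for choosing the order and for handling multiple roots are not reproduced), the sketch below fits a sum of damped exponentials in two linear steps: a linear-prediction solve for the characteristic polynomial, then a least-squares fit of the residues.

```python
# Basic least-squares Prony fit (illustrative sketch only).
# Model: y[m] ~ sum_k c_k * z_k**m.
import numpy as np

def prony(y, p):
    n = len(y)
    # Step 1: linear prediction -- find a_1..a_p such that
    # y[m] + a_1*y[m-1] + ... + a_p*y[m-p] ~ 0 for m = p..n-1.
    A = np.column_stack([y[p - 1 - k:n - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, -y[p:], rcond=None)
    # Step 2: the roots of the prediction polynomial are the modes z_k.
    z = np.roots(np.concatenate(([1.0], a))).astype(complex)
    # Step 3: least-squares fit of the residues c_k on the matrix z_k**m.
    V = np.vander(z, N=n, increasing=True).T      # V[m, k] = z_k**m
    c, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
    return z, c

# Toy test: two damped exponentials plus a little noise.
m = np.arange(60)
y = 1.5 * 0.9**m + 0.8 * 0.7**m
y = y + 0.001 * np.random.default_rng(1).standard_normal(m.size)

z, c = prony(y, p=2)
print("estimated modes:   ", np.round(z, 3))
print("estimated residues:", np.round(c, 3))
```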

Relevance:

100.00%

Publisher:

Abstract:

The advantages of standard bus systems have been appreciated for many years. The ability to connect only those modules required to perform a given task has both technical and commercial advantages over a system with a fixed architecture which cannot be easily expanded or updated. Although such bus standards have proliferated in the microprocessor field, a general purpose low-cost standard for digital video processing has yet to gain acceptance. The paper describes the likely requirements of such a system, and discusses three currently available commercial systems. A new bus specification known as Vidibus, developed to fulfil these requirements, is presented. Results from applications already implemented using this real-time bus system are also given.