850 results for Trials (Heresy)


Relevance:

20.00%

Publisher:

Abstract:

Certain characteristics of some vegetable crops allow multiple harvests during the production cycle; however, to our knowledge, no study has described the behavior of fruit production over the course of the production cycle in vegetable crops with multiple harvests that present data overdispersion. We aimed to characterize the data overdispersion of zero-inflated variables and to identify the behavior of these variables during the production cycle of several vegetable crops with multiple harvests. Data from 11 uniformity trials, conducted without applying treatments, were used; these comprise the database of the Experimental Plants Group at the Federal University of Santa Maria, Brazil. The trials were conducted with four horticultural species grown in different cultivation seasons, cultivation environments, and experimental structures. Although at each harvest more basic units with harvested fruit were observed than units without, the percentage of basic units without fruit was high, generating overdispersion within each individual harvest. The variability within each harvest was high and increased as the production cycles of Capsicum annuum, Solanum lycopersicum var. cerasiforme, Phaseolus vulgaris, and Cucurbita pepo progressed. However, the correlation coefficient between the mean weight and the number of harvested fruits tended to remain constant during the crop production cycle. These behaviors show that harvests should be managed individually, at each harvest, so that data overdispersion is reduced.
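The overdispersion and zero inflation described above can be screened for with a simple dispersion index. A minimal sketch (the function name and the per-unit counts are hypothetical, not data from the study):

```python
from statistics import mean, pvariance

def dispersion_summary(counts):
    """Return (dispersion index, zero fraction) for a list of fruit counts.

    For Poisson-distributed counts the index variance/mean is about 1;
    values well above 1, together with many zeros, indicate
    zero-inflated, overdispersed data.
    """
    m = mean(counts)
    index = pvariance(counts) / m if m > 0 else float("nan")
    zero_fraction = counts.count(0) / len(counts)
    return index, zero_fraction

# Hypothetical single-harvest counts: many empty basic units, a few heavy ones.
harvest = [0, 0, 0, 1, 0, 2, 0, 0, 7, 0, 0, 5, 0, 1, 0, 9]
index, zeros = dispersion_summary(harvest)
print(f"dispersion index = {index:.2f}, zero fraction = {zeros:.2f}")
```

For this illustrative harvest the variance is nearly five times the mean and over half the units are empty, the pattern the abstract describes within individual harvests.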

Relevance:

20.00%

Publisher:

Abstract:

Background: Most large acute stroke trials have been neutral. Functional outcome is usually analysed using a yes or no answer, e.g. death or dependency vs. independence. We assessed which statistical approaches are most efficient in analysing outcomes from stroke trials. Methods: Individual patient data from acute, rehabilitation and stroke unit trials studying the effects of interventions which alter functional outcome were assessed. Outcomes included modified Rankin Scale, Barthel Index, and ‘3 questions’. Data were analysed using a variety of approaches which compare two treatment groups. The results for each statistical test for each trial were then compared. Results: Data from 55 datasets were obtained (47 trials, 54,173 patients). The test results differed substantially so that approaches which use the ordered nature of functional outcome data (ordinal logistic regression, t-test, robust ranks test, bootstrapping the difference in mean rank) were more efficient statistically than those which collapse the data into 2 groups (chi square) (ANOVA p<0.001). The findings were consistent across different types and sizes of trial and for the different measures of functional outcome. Conclusions: When analysing functional outcome from stroke trials, statistical tests which use the original ordered data are more efficient and more likely to yield reliable results. Suitable approaches included ordinal logistic regression, t-test, and robust ranks test.

Relevance:

20.00%

Publisher:

Abstract:

Background and Purpose—Vascular prevention trials mostly count “yes/no” (binary) outcome events, eg, stroke/no stroke. Analysis of ordered categorical vascular events (eg, fatal stroke/nonfatal stroke/no stroke) is clinically relevant and could be more powerful statistically. Although this is not a novel idea in the statistical community, ordinal outcomes have not been applied to stroke prevention trials in the past. Methods—Summary data on stroke, myocardial infarction, combined vascular events, and bleeding were obtained by treatment group from published vascular prevention trials. Data were analyzed using 10 statistical approaches which allow comparison of 2 ordinal or binary treatment groups. The results for each statistical test for each trial were then compared using Friedman 2-way analysis of variance with multiple comparison procedures. Results—Across 85 trials (335 305 subjects) the test results differed substantially so that approaches which used the ordinal nature of stroke events (fatal/nonfatal/no stroke) were more efficient than those which combined the data to form 2 groups (P<0.0001). The most efficient tests were bootstrapping the difference in mean rank, Mann–Whitney U test, and ordinal logistic regression; 4- and 5-level data were more efficient still. Similar findings were obtained for myocardial infarction, combined vascular outcomes, and bleeding. The findings were consistent across different types, designs and sizes of trial, and for the different types of intervention. Conclusions—When analyzing vascular events from prevention trials, statistical tests which use ordered categorical data are more efficient and are more likely to yield reliable results than binary tests. This approach gives additional information on treatment effects by severity of event and will allow trials to be smaller. (Stroke. 2008;39:000-000.)
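Why ordinal analysis can be more efficient than binary analysis is easy to illustrate: two groups with identical binary event rates but different severity profiles are invisible to a 2×2 test, yet separated by a rank-based (Mann–Whitney-style) comparison. A minimal sketch with hypothetical data, not figures from the trials analysed:

```python
def mean_rank(group, other):
    """Mean midrank of `group` within the pooled sample (ties get midranks).

    This is the quantity a Mann-Whitney U test compares between groups.
    """
    pooled = sorted(group + other)
    midrank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        midrank[pooled[i]] = (i + 1 + j) / 2  # average of ranks i+1..j
        i = j
    return sum(midrank[v] for v in group) / len(group)

def n_events(group):
    """Number of subjects with any event (the binary view of the data)."""
    return sum(1 for v in group if v > 0)

# Severity coding: 0 = no stroke, 1 = nonfatal stroke, 2 = fatal stroke.
control   = [2] * 10 + [1] * 20 + [0] * 70
treatment = [2] * 2  + [1] * 28 + [0] * 70

# Binary view: identical event counts, so a 2x2 chi-square test sees nothing.
print(n_events(control), n_events(treatment))   # 30 30

# Ordinal view: the treatment group has a lower (less severe) mean rank.
print(mean_rank(control, treatment))            # 101.7
print(mean_rank(treatment, control))            # 99.3
```

The shift of fatal strokes to nonfatal strokes is exactly the kind of treatment effect by severity that dichotomizing discards.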

Relevance:

20.00%

Publisher:

Abstract:

Background and Purpose—Most large acute stroke trials have been neutral. Functional outcome is usually analyzed using a yes or no answer, eg, death or dependency versus independence. We assessed which statistical approaches are most efficient in analyzing outcomes from stroke trials. Methods—Individual patient data from acute, rehabilitation and stroke unit trials studying the effects of interventions which alter functional outcome were assessed. Outcomes included modified Rankin Scale, Barthel Index, and “3 questions”. Data were analyzed using a variety of approaches which compare 2 treatment groups. The results for each statistical test for each trial were then compared. Results—Data from 55 datasets were obtained (47 trials, 54 173 patients). The test results differed substantially so that approaches which use the ordered nature of functional outcome data (ordinal logistic regression, t test, robust ranks test, bootstrapping the difference in mean rank) were more efficient statistically than those which collapse the data into 2 groups (χ2; ANOVA, P<0.001). The findings were consistent across different types and sizes of trial and for the different measures of functional outcome. Conclusions—When analyzing functional outcome from stroke trials, statistical tests which use the original ordered data are more efficient and more likely to yield reliable results. Suitable approaches included ordinal logistic regression, t test, and robust ranks test.

Relevance:

20.00%

Publisher:

Abstract:

Background: Systematic reviews followed by a meta-analysis are carried out in medical research to combine the results of two or more related studies. Stroke trials have struggled to show beneficial effects, and meta-analysis should be used more widely throughout the research process, either to speed up the development of useful interventions or to halt more quickly research with hazardous or ineffective interventions. Summary of review: This review summarises the clinical research process and illustrates how and when systematic reviews may be used throughout the development programme. Meta-analyses should be performed after observational studies, preclinical studies in experimental stroke, and after phase I, II, and III clinical trials and phase IV clinical surveillance studies. Although meta-analyses most commonly work with summary data, they may be performed to assess relationships between variables (meta-regression) and, ideally, should utilise individual patient data. Meta-analysis techniques may also work with ordered categorical outcome data (ordinal meta-analysis) and be used to perform indirect comparisons where original trial data do not exist. Conclusion: Systematic reviews/meta-analyses are powerful tools in medical research and should be used throughout the development of all stroke and other interventions.
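The basic operation behind a summary-data meta-analysis is inverse-variance pooling. A minimal fixed-effect sketch (the study estimates and standard errors are hypothetical; a real analysis would also assess heterogeneity and consider a random-effects model):

```python
import math

def fixed_effect_pool(estimates, std_errors):
    """Inverse-variance fixed-effect pooling of study-level estimates.

    Each study is weighted by 1/SE^2, so precise studies dominate.
    Returns the pooled estimate and its 95% confidence interval.
    """
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical log odds ratios and standard errors from three trials.
est, (lo, hi) = fixed_effect_pool([-0.20, -0.35, -0.10], [0.10, 0.15, 0.12])
print(f"pooled = {est:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

Here three individually inconclusive trials pool to a confidence interval that excludes zero, which is precisely why meta-analysis can halt or accelerate a development programme earlier than any single trial.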

Relevance:

20.00%

Publisher:

Abstract:

Introduction: Chromium is an essential trace mineral for carbohydrate and lipid metabolism, and it is currently prescribed to help control diabetes mellitus. Results of previous systematic reviews and meta-analyses of chromium supplementation and metabolic profiles in diabetes have been inconsistent. Aim: The objective of this meta-analysis was to assess the effects of chromium supplementation on metabolic profiles, including cholesterol, and its safety in type 2 diabetes mellitus. Methods: Literature searches in PubMed, Scopus and Web of Science were conducted using related terms and keywords, limited to randomized clinical trials published during 2000-2014. Results: Thirteen trials fulfilled the inclusion criteria and were included in this systematic review. Total doses of Cr supplementation and brewer's yeast ranged from 42 to 1,000 µg/day, and duration of supplementation ranged from 30 to 120 days. The analysis indicated a significant effect of chromium supplementation in diabetics on fasting plasma glucose, with a weighted average effect size of -29.26 mg/dL (p = 0.01, 95% CI = -52.4 to -6.09), and on total cholesterol, with a weighted average effect size of -6.7 mg/dL (p = 0.01, 95% CI = -11.88 to -1.53). Conclusions: The available evidence suggests favourable effects of chromium supplementation on glycaemic control in patients with diabetes. Chromium supplementation may additionally improve total cholesterol levels.

Relevance:

20.00%

Publisher:

Abstract:

Neonatal seizures are common in the neonatal intensive care unit. Clinicians treat these seizures with several anti-epileptic drugs (AEDs) to reduce seizures in a neonate. Current AEDs exhibit sub-optimal efficacy and several randomized control trials (RCT) of novel AEDs are planned. The aim of this study was to measure the influence of trial design on the required sample size of a RCT. We used seizure time courses from 41 term neonates with hypoxic ischaemic encephalopathy to build seizure treatment trial simulations. We used five outcome measures, three AED protocols, eight treatment delays from seizure onset (Td) and four levels of trial AED efficacy to simulate different RCTs. We performed power calculations for each RCT design and analysed the resultant sample size. We also assessed the rate of false positives, or placebo effect, in typical uncontrolled studies. We found that the false positive rate ranged from 5 to 85% of patients depending on RCT design. For controlled trials, the choice of outcome measure had the largest effect on sample size with median differences of 30.7 fold (IQR: 13.7–40.0) across a range of AED protocols, Td and trial AED efficacy (p<0.001). RCTs that compared the trial AED with positive controls required sample sizes with a median fold increase of 3.2 (IQR: 1.9–11.9; p<0.001). Delays in AED administration from seizure onset also increased the required sample size 2.1 fold (IQR: 1.7–2.9; p<0.001). Subgroup analysis showed that RCTs in neonates treated with hypothermia required a median fold increase in sample size of 2.6 (IQR: 2.4–3.0) compared to trials in normothermic neonates (p<0.001). These results show that RCT design has a profound influence on the required sample size. Trials that use a control group, appropriate outcome measure, and control for differences in Td between groups in analysis will be valid and minimise sample size.
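The study's central point, that design choices multiply the required sample size, can be sketched with the standard normal-approximation formula for comparing two proportions (this is not the simulation method used in the study, and the responder rates below are hypothetical):

```python
import math

def n_per_group(p1, p2, z_alpha=1.96, z_beta=0.8416):
    """Approximate per-arm sample size for comparing two proportions
    (normal approximation; defaults give two-sided alpha=0.05, power=0.80)."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Hypothetical responder rates: detecting a 50% -> 65% improvement.
print(n_per_group(0.50, 0.65))    # 167 per arm
# Halving the detectable effect (50% -> 57.5%) roughly quadruples n,
# which is how treatment delays or a positive-control comparator
# (both of which shrink the observable effect) inflate trial size.
print(n_per_group(0.50, 0.575))   # 690 per arm
```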

Relevance:

20.00%

Publisher:

Abstract:

This thesis investigates how ways of being in different ontologies emerge from material and embodied practice. This general concern is explored through the particular case study of Scotland in the period of the witch trials (the 16th and 17th centuries C.E.). The field of early modern Scottish witchcraft studies has been active and dynamic over the past 15 years, but its prioritisation of what people said over what they did leaves a clear gap for a situated and relational approach focusing upon materiality. Such an approach requires a move away from the Cartesian dichotomies of modern ontology to recognise past beliefs as real to those who experienced them, co-constitutive of embodiment and of the material worlds people inhabited. In theory, method and practice, this demands a different way of exploring past worlds to avoid flattening strange data. To this end, the study incorporates narratives and ‘disruptions’ – unique engagements with Contemporary Art which facilitate understanding by enabling the temporary suspension of disbelief. The methodology is iterative, tacking between material and written sources in order to better understand the heterogeneous assemblages of early modern (counter-)witchcraft. Previously separate areas of discourse are (re-)constituted into alternative ontic categories of newly-parallel materials. New interpretations of things, places, bodies and personhoods emerge, raising questions about early modern experiences of the world. Three thematic chapters explore different sets of collaborative agencies as they entwine into new things, co-fabricating a very different world. Moving between witch trial accounts, healing wells, infant burial grounds, animals, discipline artefacts and charms, the boundaries of all prove highly permeable.
People, cloth and place bleed into one another through contact; trees and water emerge as powerful agents of magical-place-making; and people and animals meet to become single, hybrid-persons spread over two bodies. Life and death consistently emerge as protracted processes with the capacity to overlap and occur simultaneously in problematic ways. The research presented in this thesis establishes a new way of looking at the nature of Being as experienced by early modern Scots. This provides a foundation for further studies, which can draw in other materials not explored here such as communion wares and metal charms. Comparison with other early modern Western societies may also prove fruitful. Furthermore, the methodology may be suitable for application to other interdisciplinary projects incorporating historical and material evidence.

Relevance:

10.00%

Publisher:

Abstract:

SRI has examined the organosolv (organic solvation) pulping of Australian bagasse using technology supplied by Ecopulp. In the process, bagasse is reacted with aqueous ethanol in a digester at elevated temperatures (between 150°C and 200°C). The products from the digester are separated using proprietary technology before further processing into a range of saleable products. Test trials were undertaken using two batch digesters: the first capable of pulping about 25 g of wet depithed bagasse, and the second, larger samples of about 1.5 kg of wet depithed bagasse. In this study, the unbleached pulp produced from fresh bagasse did not have good enough strength properties for the production of corrugated medium for cartons and bleached pulp. In particular, the lignin contents, as indicated by the Kappa numbers of the unbleached pulps, were high for making bleached pulp. However, in spite of the high lignin content, it is possible to bleach the pulp to acceptable brightness levels of up to 86.6% ISO. The economics were assessed for three pricing tiers (low, medium and high). The economic return for a plant that produces 100 air dry t/d of brownstock pulp is satisfactory at both the high and medium pricing levels. The outcomes from the project justify continuing the work through to either a pilot plant or an upgraded laboratory facility.

Relevance:

10.00%

Publisher:

Abstract:

The use of allograft bone is increasingly common in orthopaedic reconstruction procedures. The optimal method of preparation of allograft bone is the subject of great debate. Proponents of fresh-frozen graft cite improved biological and biomechanical characteristics relative to irradiated material, whereas fear of bacterial or viral transmission leads some to favour irradiated graft. Careful review of the literature is necessary to appreciate the influence of processing techniques on bone quality. While only limited clinical trials are available to guide the selection of appropriate bone graft, this review presents the argument favouring the use of fresh-frozen bone allograft over irradiated bone.

Relevance:

10.00%

Publisher:

Abstract:

The basis of treatment for amblyopia (poor vision due to abnormal visual experience early in life) for 250 years has been patching of the unaffected eye for extended times to ensure a period of use of the affected eye. Over the last decade, randomised controlled treatment trials have provided some evidence on how to tailor amblyopia therapy more precisely to achieve the best visual outcome with the least negative impact on the patient and the family. This review highlights the expansion of knowledge regarding treatment for amblyopia and aims to provide optometrists with a summary of the research evidence to enable them to better treat amblyopia. Treatment for amblyopia is effective, as it reduces the overall prevalence and severity of visual loss in this population. Correction of refractive error alone significantly improves visual acuity, sometimes to the point where further amblyopia treatment is not required. Atropine penalisation and patch occlusion are effective in treating amblyopia. Lesser amounts of occlusion or penalisation have been found to be just as effective as greater amounts. Recent evidence has highlighted that occlusion or penalisation in amblyopia treatment can create negative changes in behaviour in children and impact on family life. These complications should be considered when prescribing treatment because they can negatively affect compliance. Studies investigating the maximum age at which treatment of amblyopia can still be effective and the importance of near activities during occlusion are ongoing.

Relevance:

10.00%

Publisher:

Abstract:

Single nucleotide polymorphisms (SNPs) are unique genetic differences between individuals that contribute in significant ways to the determination of human variation including physical characteristics like height and appearance as well as less obvious traits such as personality, behaviour and disease susceptibility. SNPs can also significantly influence responses to pharmacotherapy and whether drugs will produce adverse reactions. The development of new drugs can be made far cheaper and more rapid by selecting participants in drug trials based on their genetically determined response to drugs. Technology that can rapidly and inexpensively genotype thousands of samples for thousands of SNPs at a time is therefore in high demand. With the completion of the human genome project, about 12 million true SNPs have been identified to date. However, most have not yet been associated with disease susceptibility or drug response. Testing for the appropriate drug response SNPs in a patient requiring treatment would enable individualised therapy with the right drug and dose administered correctly the first time. Many pharmaceutical companies are also interested in identifying SNPs associated with polygenic traits so novel therapeutic targets can be discovered. This review focuses on technologies that can be used for genotyping known SNPs as well as for the discovery of novel SNPs associated with drug response.

Relevance:

10.00%

Publisher:

Abstract:

Use of Unmanned Aerial Vehicles (UAVs) in support of government applications has already seen significant growth, and the potential for use of UAVs in commercial applications is expected to expand rapidly in the near future. However, the issue remains of how such automated or operator-controlled aircraft can be safely integrated into current airspace. If the goal of integration is to be realized, issues regarding safe separation in densely populated airspace must be investigated. This paper investigates automated separation management concepts in uncontrolled airspace that may help prepare for an expected growth of UAVs in Class G airspace. Not only are such investigations helpful for the UAV integration issue, the automated separation management concepts investigated by the authors can also be useful for the development of new or improved Air Traffic Control services in remote regions without any existing infrastructure. The paper also provides an overview of the Smart Skies program and discusses the corresponding Smart Skies research and development effort to evaluate aircraft separation management algorithms using simulations involving real-world data communication channels, verified against actual flight trials. This paper presents results from a unique flight test concept that uses real-time flight test data from Australia, sent over existing commercial communication channels to a control center in Seattle, for real-time separation management of actual and simulated aircraft. The paper also assesses the performance of an automated aircraft separation manager.

Relevance:

10.00%

Publisher:

Abstract:

Recent initiatives around the world have highlighted the potential for information and communications technology (ICT) to foster better service delivery for businesses. Likewise, ICT has also been applied to government services and is seen to result in improved service delivery, improved citizen participation in government, and enhanced cooperation within and between government departments. The Council of Australian Governments (COAG) (2006) identified local government development assessment (DA) arrangements as a ‘hot spot’ needing specific attention, as the inconsistent policies and regulations between councils impeded regional economic activity. COAG (2006) specifically suggested that trials of various ICT mechanisms be initiated, which may well be able to improve DA processes for local government. While the authors have explored various regulatory mechanisms to improve harmonisation elsewhere (Brown and Furneaux 2007), the possibility of ICT being able to enhance consistency across governments is a novel notion from a public policy perspective. Consequently, this paper explores the utility of ICT initiatives to improve harmonisation of DA across local governments. It examines as a case study the recent attempt to streamline DA in local governments in South East Queensland. This initiative was funded by the Regulation Reduction Incentive Fund (RRIF) and championed by the South East Queensland (SEQ) Council of Mayors. The RRIF program was created by the Australian government with the aim of providing incentives to local councils to reduce red tape for small and medium sized businesses. The funding for the program was facilitated through a competitive merit-based grants process targeted at Local Government Authorities. Grants were awarded to projects which targeted specific areas identified for reform (AusIndustry, 2007); in SEQ this focused on improving DA processes and creating transparency in environmental health policies, regulation and compliance. An important factor to note with this case study is that it is unusual for an eGovernment initiative: typically, individual government departments undertake eGovernment projects in order to improve their internal performance, whereas the RRIF case study examines the implementation of an eGovernment initiative across 21 autonomous local councils in South East Queensland. In order to move ahead, agreement needed to be reached between councils at the highest level. After reviewing the concepts of eGovernment and eGovernance, a literature review is undertaken to identify the typical costs and benefits, barriers and enablers of ICT projects in government. The specific case of the RRIF project is then examined to determine whether similar costs and benefits, barriers and enablers could be found. The outcomes of the project, particularly in reducing red tape by increasing harmonisation between councils, are explored.