928 results for automation of fit analysis


Relevance: 100.00%

Abstract:

Goodness-of-fit tests have been studied by many researchers. Among them, an alternative statistical test for uniformity was proposed by Chen and Ye (2009). The test was used by Xiong (2010) to test normality in the case where both the location and scale parameters of the normal distribution are known. The purpose of the present thesis is to extend the result to the case where the parameters are unknown. A table of critical values for the test statistic is obtained using Monte Carlo simulation. The performance of the proposed test is compared with the Shapiro-Wilk test and the Kolmogorov-Smirnov test. Monte Carlo simulation results show that the proposed test performs better than the Kolmogorov-Smirnov test in many cases. The Shapiro-Wilk test is still the most powerful test, although in some cases the test proposed in the present research performs better.
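A minimal sketch of how such a critical-value table can be produced by simulation when the normal parameters are estimated from each sample. The statistic `gof_statistic` below is a generic uniformity-type statistic on the probability-integral transform, chosen purely for illustration; it is not the Chen-Ye (2009) statistic used in the thesis.

```python
# Monte Carlo tabulation of critical values for a goodness-of-fit statistic
# under H0: normality with unknown (estimated) location and scale.
# Illustrative only: the statistic below is a stand-in, not the Chen-Ye test.
import numpy as np
from scipy import stats

def gof_statistic(x):
    """Compare sorted PIT values (with estimated parameters) to uniform plotting positions."""
    n = len(x)
    mu, sigma = x.mean(), x.std(ddof=1)          # parameters estimated from the sample
    u = np.sort(stats.norm.cdf(x, loc=mu, scale=sigma))
    expected = (np.arange(1, n + 1) - 0.5) / n
    return np.sum((u - expected) ** 2)

def critical_value(n, alpha=0.05, n_sim=20_000, seed=0):
    """Approximate the (1 - alpha) critical value by simulation under H0."""
    rng = np.random.default_rng(seed)
    sims = np.array([gof_statistic(rng.standard_normal(n)) for _ in range(n_sim)])
    return np.quantile(sims, 1 - alpha)

if __name__ == "__main__":
    for n in (10, 20, 50):
        print(n, round(critical_value(n), 4))
```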

Relevance: 100.00%

Abstract:

Policy makers are often called upon to navigate between scientists’ urgent calls for long-term concerted action to reduce the environmental impacts of resource use, and the public’s concerns over policies that threaten lifestyles or jobs. Against these political challenges, resource efficiency policy making is often a changeable and even chaotic process, which has fallen short of the political ambitions set by democratically elected governments. This article examines the importance of paradigms in understanding how the public collectively responds to new policy proposals, such as those developed within the project DYNAmic policy MiXes for absolute decoupling of environmental impact of EU resource use from economic growth (DYNAMIX). The resulting proposed approach provides a framework for understanding how different concerns and worldviews converge within public discourse, potentially resulting in paradigm change. Thus an alternative perspective on how resource efficiency policy can be developed is proposed, which envisages early policies laying the ground for future far-reaching policies by altering the underlying paradigm context in which the public receives and responds to policy. The article concludes by arguing that paradigm change is more likely if the policy is conceived, framed, designed, analyzed, presented, and evaluated from the worldview or paradigm pathway that it seeks to create (i.e. the destination paradigm).

Relevance: 100.00%

Abstract:

BACKGROUND: Radium-223 dichloride (radium-223), a first-in-class α-emitting radiopharmaceutical, is recommended in both pre- and post-docetaxel settings in patients with castration-resistant prostate cancer (CRPC) and symptomatic bone metastases based on overall survival benefit demonstrated in the phase III ALSYMPCA study. ALSYMPCA included prospective measurements of health-related quality of life (QOL) using two validated instruments: the general EuroQoL 5D (EQ-5D) and the disease-specific Functional Assessment of Cancer Therapy-Prostate (FACT-P).

PATIENTS AND METHODS: Analyses were conducted to determine treatment effects of radium-223 plus standard of care (SOC) versus placebo plus SOC on QOL using FACT-P and EQ-5D. Outcomes assessed were percentage of patients experiencing improvement, percentage of patients experiencing worsening, and mean QOL scores during the study.

RESULTS: Analyses were carried out on the intent-to-treat population of patients randomized to receive radium-223 (n = 614) or placebo (n = 307). The mean baseline EQ-5D utility and FACT-P total scores were similar between treatment groups. A significantly higher percentage of patients receiving radium-223 experienced meaningful improvement in EQ-5D utility score on treatment versus placebo {29.2% versus 18.5%, respectively; P = 0.004; odds ratio (OR) = 1.82 [95% confidence interval (CI) 1.21-2.74]}. Findings were similar for FACT-P total score [24.6% versus 16.1%, respectively; P = 0.020; OR = 1.70 (95% CI 1.08-2.65)]. A lower percentage of patients receiving radium-223 experienced meaningful worsening versus placebo measured by EQ-5D utility score and FACT-P total score. Prior docetaxel use and current bisphosphonate use did not affect these findings. Treatment was a significant predictor of EQ-5D utility score, with radium-223 associated with higher scores versus placebo (0.56 versus 0.50, respectively; P = 0.002). Findings were similar for FACT-P total score (99.08 versus 95.22, respectively; P = 0.004).

CONCLUSIONS: QOL data from ALSYMPCA demonstrated that improved survival with radium-223 is accompanied by significant QOL benefits, including a higher percentage of patients with meaningful QOL improvement and a slower decline in QOL over time in patients with CRPC.
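For readers who want to relate the reported effect size to the raw percentages, here is a minimal sketch of a crude (unadjusted) odds ratio with a Wald confidence interval, reconstructed from the figures in the abstract. The counts are approximations derived from the stated percentages and group sizes, and the published OR and CI likely come from an adjusted model, so this is purely illustrative.

```python
# Crude odds ratio and Wald 95% CI for "meaningful EQ-5D improvement",
# reconstructed from the abstract (29.2% of n = 614 vs 18.5% of n = 307).
# Illustrative only: the published OR (1.82, 95% CI 1.21-2.74) is likely model-adjusted.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a/b = events/non-events (treated), c/d = events/non-events (control)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

a = round(0.292 * 614)   # improved, radium-223
b = 614 - a
c = round(0.185 * 307)   # improved, placebo
d = 307 - c
print(odds_ratio_ci(a, b, c, d))   # crude OR comes out close to the reported 1.82
```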

Relevance: 100.00%

Abstract:

Beef businesses in northern Australia are facing increased pressure to remain productive and profitable amid challenges such as climate variability and poor financial performance over the past decade. Declining terms of trade, limited recent gains in on-farm productivity, low profit margins under current management systems and current climatic conditions will leave little capacity for businesses to absorb climate change-induced losses. In order to generate a whole-of-business focus on management change, the Climate Clever Beef project in the Maranoa-Balonne region of Queensland trialled the use of business analysis with beef producers to improve financial literacy, provide a greater understanding of current business performance and initiate changes to current management practices. Demonstration properties were engaged and a systematic approach was used to assess current business performance, evaluate the impacts of management changes on the business, trial practices and promote successful outcomes to the wider industry. Efforts focused on improving financial literacy skills, understanding the business’s key performance indicators and modifying practices to improve both business productivity and profitability. To best achieve the desired outcomes, several extension models were employed: the ‘group facilitation/empowerment model’, the ‘individual consultant/mentor model’ and the ‘technology development model’. Providing producers with a whole-of-business approach and using business analysis in conjunction with on-farm trials and various extension methods proved to be a successful way to encourage producers in the region to adopt new practices in their businesses, in the areas of greatest impact. The areas targeted for development within businesses generally led to improvements in animal performance and grazing land management, further improving the prospects for climate resilience.

Relevance: 100.00%

Abstract:

Introduction to microorganisms and foodborne diseases. Activities in a Food Microbiology Laboratory.

Relevance: 100.00%

Abstract:

The growing interest in constellations of small, less expensive satellites is bringing space junk and traffic management to the attention of the space community. At the same time, the continuous quest for more efficient propulsion systems puts the spotlight on electric (low-thrust) propulsion as an appealing solution for collision avoidance. Starting with an overview of the current techniques for conjunction assessment and avoidance, we then highlight the possible problems when low-thrust propulsion is used. The need for an accurate propagation model emerges from the conducted simulations. Thus, aiming at propagation models with a low computational burden, we study the available models from the literature and propose an analytical alternative to improve propagation accuracy. The model is then tested in the particular case of a tangential maneuver. Results show that the proposed solution significantly improves on state-of-the-art methods and is a good candidate for use in collision avoidance operations, for instance to propagate satellite uncertainty or to optimize an avoidance maneuver when a conjunction occurs within a few (3-4) orbits of the measurement time.
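The abstract does not spell out its analytical propagation model, so the following is only a minimal sketch of the kind of averaged semi-major-axis propagation such an analysis builds on, under the assumed simplifications of a near-circular orbit and a small constant tangential acceleration (Gauss variational equation da/dt ≈ 2 f_t a^1.5 / sqrt(mu)); it is not the alternative model proposed in the paper.

```python
# Averaged propagation of the semi-major axis under a constant tangential
# low-thrust acceleration, assuming a near-circular orbit:
#   da/dt ≈ 2 * f_t * a**1.5 / sqrt(mu)   (Gauss variational equation)
# Sketch only, not the analytical model proposed in the paper.
import math

MU_EARTH = 3.986004418e14      # m^3/s^2

def propagate_sma(a0, f_t, t_end, dt=10.0):
    """Integrate da/dt with fixed-step RK4; a0 in m, f_t in m/s^2, times in s."""
    def dadt(a):
        return 2.0 * f_t * a ** 1.5 / math.sqrt(MU_EARTH)
    a, t = a0, 0.0
    while t < t_end:
        k1 = dadt(a)
        k2 = dadt(a + 0.5 * dt * k1)
        k3 = dadt(a + 0.5 * dt * k2)
        k4 = dadt(a + dt * k3)
        a += dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        t += dt
    return a

if __name__ == "__main__":
    a0 = 6378e3 + 500e3                                  # ~500 km altitude orbit
    period = 2 * math.pi * math.sqrt(a0 ** 3 / MU_EARTH)
    print(propagate_sma(a0, f_t=1e-7, t_end=4 * period) - a0)  # raise over ~4 orbits, in m
```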

Relevance: 100.00%

Abstract:

In this Ph.D. project, original and innovative approaches for the qualitative and quantitative analysis of substances of abuse, as well as therapeutic agents with abuse potential and related compounds, were designed, developed and validated for application to different fields, such as the forensic, clinical and pharmaceutical ones. All the parameters involved in the developed analytical workflows were properly and accurately optimised, from sample collection to sample pretreatment up to the instrumental analysis. Advanced dried blood microsampling technologies have been developed, capable of bringing several advantages to the method as a whole, such as a significant reduction in solvent use, feasible storage and transportation conditions and enhanced analyte stability. At the same time, the use of capillary blood increases subject compliance and overall method applicability when exploiting such innovative technologies. Both the biological and non-biological samples involved in this project were subjected to optimised pretreatment techniques developed ad hoc for each target analyte, also making use of advanced microextraction techniques. Finally, original and advanced instrumental analytical methods have been developed based on high- and ultra-high-performance liquid chromatography (HPLC, UHPLC) coupled to different detection means (mainly mass spectrometry, but also electrochemical and spectrophotometric detection for screening purposes), and on attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR) for solid-state analysis. Each method has been designed to obtain highly selective, sensitive yet sustainable systems and has been validated according to international guidelines. All the methods developed herein proved to be suitable for the analysis of the compounds under investigation and may be useful tools in medicinal chemistry, pharmaceutical analysis, clinical studies and forensic investigations.

Relevance: 100.00%

Abstract:

Artificial Intelligence (AI) is gaining ever more ground in every sphere of human life, to the point that it is now even used to pass sentences in courts. The use of AI in the field of Law is, however, deemed quite controversial, as it could provide more objectivity yet also entail an abuse of power, given that bias in the algorithms behind AI may cause a lack of accuracy. As a product of AI, machine translation is being increasingly used in the field of Law too, in order to translate laws, judgements, contracts, etc. between different languages and different legal systems. In the legal setting of Company Law, accuracy of the content and suitability of terminology play a crucial role within a translation task, as any addition or omission of content or mistranslation of terms could entail legal consequences for companies. The purpose of the present study is first to assess which of the two neural machine translation systems, DeepL and ModernMT, produces a more suitable translation from Italian into German of the atto costitutivo (deed of incorporation) of an Italian s.r.l. in terms of accuracy of the content and correctness of terminology, and then to assess which translation proves to be closer to a human reference translation. In order to achieve the above-mentioned aims, two evaluations, a human one based on the MQM taxonomy and an automatic one based on the BLEU metric, are carried out. The results of both evaluations show an overall better performance by ModernMT in terms of content accuracy, suitability of terminology, and closeness to a human translation. As emerged from the MQM-based evaluation, its accuracy and terminology errors account for just 8.43% (as opposed to DeepL’s 9.22%), while it obtains an overall BLEU score of 29.14 (against DeepL’s 27.02). The overall performances, however, show that machines still face barriers in overcoming semantic complexity, tackling polysemy and choosing domain-specific terminology, which suggests that the discrepancy with human translation may still be remarkable.
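A minimal sketch of the automatic part of such an evaluation: scoring two candidate MT outputs against a human reference with corpus-level BLEU via the sacreBLEU library. The German segments below are placeholders, not sentences from the thesis corpus, and "system A"/"system B" are generic labels rather than DeepL and ModernMT output.

```python
# Corpus-level BLEU of two MT outputs against a single human reference,
# mirroring the kind of automatic evaluation described above.
# The sentences are placeholders, not the actual atto costitutivo corpus.
import sacrebleu

reference = [
    "Die Gesellschaft hat ihren Sitz in Bologna.",
    "Das Stammkapital beträgt 10.000 Euro.",
]
hypothesis_a = [  # e.g. output of system A
    "Die Gesellschaft hat ihren Sitz in Bologna.",
    "Das Gesellschaftskapital beträgt 10.000 Euro.",
]
hypothesis_b = [  # e.g. output of system B
    "Der Sitz der Gesellschaft ist Bologna.",
    "Das Stammkapital ist 10.000 Euro.",
]

for name, hyp in (("system A", hypothesis_a), ("system B", hypothesis_b)):
    bleu = sacrebleu.corpus_bleu(hyp, [reference])   # refs: list of reference streams
    print(name, round(bleu.score, 2))
```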

Relevance: 100.00%

Abstract:

We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and analysis of the Dubbo weed data set. In addition, a simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation. In addition, the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike’s information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
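A minimal sketch, under simplifying assumptions chosen for illustration (a log-linear intensity in a single covariate, a half-normal detection function, and a grid approximation of the integral over the surveyed strips), of the kind of thinned spatial Poisson process log-likelihood the authors describe; it is not their implementation, and the data below are synthetic placeholders.

```python
# Thinned spatial Poisson process log-likelihood for distance sampling (sketch).
# Assumed model: intensity lambda(s) = exp(b0 + b1*x(s)),
# half-normal detection g(d) = exp(-d^2 / (2*sigma^2)),
# integral of lambda(s)*g(d(s)) over the covered strips approximated on a grid.
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(theta, det_x, det_d, grid_x, grid_d, cell_area):
    b0, b1, log_sigma = theta
    sigma = np.exp(log_sigma)
    # detected points contribute the log of the thinned intensity
    ll = np.sum(b0 + b1 * det_x - det_d ** 2 / (2 * sigma ** 2))
    # minus the integral of the thinned intensity over the covered region
    lam_grid = np.exp(b0 + b1 * grid_x) * np.exp(-grid_d ** 2 / (2 * sigma ** 2))
    ll -= np.sum(lam_grid) * cell_area
    return -ll

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # placeholder detections and integration grid for a single strip
    det_x, det_d = rng.normal(size=40), np.abs(rng.normal(scale=0.3, size=40))
    gx, gd = np.meshgrid(np.linspace(-2, 2, 60), np.linspace(0, 1, 30))
    fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0],
                   args=(det_x, det_d, gx.ravel(), gd.ravel(), (4 / 60) * (1 / 30)))
    print(fit.x)   # estimated (b0, b1, log sigma)
```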

Relevance: 100.00%

Abstract:

Nested clade phylogeographic analysis (NCPA) is a popular method for reconstructing the demographic history of spatially distributed populations from genetic data. Although some parts of the analysis are automated, there is no unique and widely followed algorithm for doing this in its entirety, beginning with the data, and ending with the inferences drawn from the data. This article describes a method that automates NCPA, thereby providing a framework for replicating analyses in an objective way. To do so, a number of decisions need to be made so that the automated implementation is representative of previous analyses. We review how the NCPA procedure has evolved since its inception and conclude that there is scope for some variability in the manual application of NCPA. We apply the automated software to three published datasets previously analyzed manually and replicate many details of the manual analyses, suggesting that the current algorithm is representative of how a typical user will perform NCPA. We simulate a large number of replicate datasets for geographically distributed, but entirely random-mating, populations. These are then analyzed using the automated NCPA algorithm. Results indicate that NCPA tends to give a high frequency of false positives. In our simulations we observe that 14% of the clades give a conclusive inference that a demographic event has occurred, and that 75% of the datasets have at least one clade that gives such an inference. This is mainly due to the generation of multiple statistics per clade, of which only one is required to be significant to apply the inference key. We survey the inferences that have been made in recent publications and show that the most commonly inferred processes (restricted gene flow with isolation by distance and contiguous range expansion) are those that are commonly inferred in our simulations. However, published datasets typically yield a richer set of inferences with NCPA than obtained in our random-mating simulations, and further testing of NCPA with models of structured populations is necessary to examine its accuracy.
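The multiple-testing explanation offered above can be made concrete with a short sketch: if each clade contributes k distance statistics and a single result significant at level alpha is enough to enter the inference key, the per-clade false-positive probability is 1 − (1 − alpha)^k. Independence of the statistics and the clade count per dataset are assumed here purely for illustration; the NCPA statistics are correlated in practice, so these numbers are not the 14% and 75% reported in the study.

```python
# Per-clade and per-dataset false-positive probabilities when any one of
# k statistics significant at level alpha is enough to apply the inference key.
# Independence assumed for illustration only; NCPA distance statistics are correlated.
alpha = 0.05

def p_any_significant(k, alpha=alpha):
    return 1 - (1 - alpha) ** k

for k in (2, 3, 4):
    per_clade = p_any_significant(k)
    per_dataset = 1 - (1 - per_clade) ** 10   # assuming, say, 10 clades per dataset
    print(f"k={k}: per-clade {per_clade:.3f}, per-dataset (10 clades) {per_dataset:.3f}")
```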