Abstract:
Although LH is essential for survival and function of the corpus luteum (CL) in higher primates, luteolysis occurs during nonfertile cycles without a discernible decrease in circulating LH levels. Using genome-wide expression analysis, several experiments were performed to examine the processes of luteolysis and rescue of luteal function in monkeys. Induced luteolysis with a GnRH receptor antagonist (Cetrorelix) resulted in differential regulation of 3949 genes, whereas replacement with exogenous LH (Cetrorelix plus LH) led to regulation of 4434 genes (1563 down-regulated and 2871 up-regulated). A model system for prostaglandin (PG) F(2 alpha)-induced luteolysis in the monkey was standardized and demonstrated that PGF(2 alpha) regulated expression of 2290 genes in the CL. Analysis of the LH-regulated luteal transcriptome revealed that 120 genes were regulated in an antagonistic fashion by PGF(2 alpha). Based on the microarray data, 25 genes were selected for validation by real-time RT-PCR analysis, and expression of these genes was also examined in the CL throughout the luteal phase and from monkeys treated with human chorionic gonadotropin (hCG) to mimic early pregnancy. The results indicated changes in expression of genes favorable to PGF(2 alpha) action during the late to very late luteal phase, and expression of many of these genes was regulated in the opposite manner by exogenous hCG treatment. Collectively, the findings suggest that curtailment of expression of downstream LH-target genes, possibly through PGF(2 alpha) action on the CL, is among the mechanisms underlying the cross talk between luteotropic and luteolytic signaling pathways that results in the cessation of luteal function; hCG, in contrast, is likely to abrogate the PGF(2 alpha)-responsive gene expression changes, resulting in the luteal rescue crucial for the maintenance of early pregnancy. (Endocrinology 150: 1473-1484, 2009)
Abstract:
A major question in current network science is how to understand the relationship between the structure and functioning of real networks. Here we present a comparative network analysis of 48 wasp and 36 human social networks. We compared the centralisation and small-world character of these interaction networks and studied how these properties change over time. We compared the interaction networks of (1) two congeneric wasp species (Ropalidia marginata and Ropalidia cyathiformis), (2) the queen-right (with the queen) and queen-less (without the queen) networks of wasps, (3) the four network types obtained by combining (1) and (2) above, and (4) wasp networks with the social networks of children in 36 classrooms. We found perfect (100%) centralisation in a queen-less wasp colony and nearly perfect centralisation in several other queen-less wasp colonies. Notably, a perfectly centralised interaction network is exceptional in the literature on real-world networks. Differences between the interaction networks of the two wasp species are smaller than differences between the networks describing their different colony conditions. Also, the differences between different colony conditions are larger than the differences between wasp and children's networks. For example, the structure of queen-right R. marginata colonies is more similar to children's social networks than to that of their queen-less colonies. We conclude that network architecture depends more on the functioning of the particular community than on taxonomic differences (either between two wasp species or between wasps and humans).
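The "perfect (100%) centralisation" reported above can be illustrated with Freeman's degree-centralisation index, which equals exactly 1.0 for a star-shaped network in which every individual interacts only with a single central one. A minimal sketch (the toy networks here are illustrative, not the authors' colony data):

```python
def degree_centralization(adj):
    """Freeman degree centralization of an undirected network.

    adj: adjacency as a dict {node: set of neighbours}.
    Returns 1.0 for a perfect star and 0.0 for a regular network.
    """
    n = len(adj)
    degrees = [len(nbrs) for nbrs in adj.values()]
    d_max = max(degrees)
    # Sum of differences from the most central node, normalised by the
    # maximum possible sum, attained by a star on n nodes: (n-1)(n-2).
    return sum(d_max - d for d in degrees) / ((n - 1) * (n - 2))

# A "queen-centred" star: individual 0 interacts with everyone else,
# and no one else interacts with each other.
star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
print(degree_centralization(star))  # 1.0 -- perfect (100%) centralisation

# A ring, where every individual has the same degree, is fully decentralised.
ring = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print(degree_centralization(ring))  # 0.0
```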
Abstract:
Selection criteria and misspecification tests for the intra-cluster correlation structure (ICS) in longitudinal data analysis are considered. In particular, the asymptotic distribution of the correlation information criterion (CIC) is derived, and a new method for selecting a working ICS is proposed by standardizing the selection criterion as a p-value. The CIC test is found to be powerful in detecting misspecification of working ICS structures, while with respect to working ICS selection, the standardized CIC test is also shown to have satisfactory performance. Simulation studies and applications to two real longitudinal datasets illustrate how these criteria and tests can be used.
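The CIC takes the form trace(Ω̂ V̂r): the model-based information matrix under the working independence structure multiplied by the robust (sandwich) covariance. A hedged numpy sketch of the sanity property that motivates it, namely that when the working structure is correct the two covariances coincide and the criterion reduces to the number of regression parameters p (the matrices here are synthetic, not taken from the paper):

```python
import numpy as np

def cic(model_info, robust_cov):
    """Correlation information criterion: trace of the model-based
    information matrix times the robust (sandwich) covariance."""
    return np.trace(model_info @ robust_cov)

rng = np.random.default_rng(0)
p = 4
# Build a synthetic positive-definite model-based covariance V.
A = rng.standard_normal((p, p))
V = A @ A.T + p * np.eye(p)

# If the working correlation structure is correctly specified, the robust
# covariance equals the model-based one and CIC reduces to p.
print(cic(np.linalg.inv(V), V))  # 4.0, up to floating-point rounding

# Under misspecification the two covariances differ and CIC moves away
# from p, which is what the criterion exploits.
```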
Abstract:
We consider rank-based regression models for clustered data analysis. A weighted Wilcoxon rank method is proposed to account for within-cluster correlations and varying cluster sizes. The asymptotic normality of the resulting estimators is established. A method to estimate the covariance of the estimators is also given, which bypasses estimation of the density function. Simulation studies compare different estimators across a number of scenarios varying the correlation structure, the presence or absence of outliers, and the correlation values. The proposed methods appear to perform well; in particular, the one incorporating the correlation in the weighting achieves the highest efficiency and is robust against misspecification of the correlation structure and against outliers. A real example is provided for illustration.
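Rank estimators of this kind minimise Jaeckel's dispersion of the residuals, a sum of residuals weighted by scores of their ranks. A minimal one-covariate sketch using plain Wilcoxon scores and a grid search (the data are synthetic, and the paper's estimator additionally weights by cluster, which is omitted here):

```python
import numpy as np

def wilcoxon_dispersion(beta, x, y):
    """Jaeckel's dispersion with Wilcoxon scores a(i) = sqrt(12)*(i/(n+1) - 1/2)."""
    e = y - beta * x
    n = len(e)
    ranks = np.argsort(np.argsort(e)) + 1           # ranks of the residuals, 1..n
    scores = np.sqrt(12) * (ranks / (n + 1) - 0.5)  # centred Wilcoxon scores
    return np.sum(scores * e)

rng = np.random.default_rng(1)
x = rng.standard_normal(200)
y = 2.0 * x + rng.standard_normal(200)  # true slope 2.0

# Minimise the dispersion over a grid of candidate slopes.
grid = np.linspace(0.0, 4.0, 801)
beta_hat = grid[np.argmin([wilcoxon_dispersion(b, x, y) for b in grid])]
print(round(beta_hat, 2))  # close to the true slope 2.0
```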
Abstract:
Solid materials can exist in different physical structures without a change in chemical composition. This phenomenon, known as polymorphism, has several implications for pharmaceutical development and manufacturing. Different solid forms of a drug can possess different physical and chemical properties, which may affect processing characteristics and stability, as well as the performance of the drug in the human body. Therefore, knowledge and control of solid forms are fundamental to maintaining the safety and high quality of pharmaceuticals. During manufacture, harsh conditions can give rise to unexpected solid phase transformations and thereby change the behavior of the drug. Traditionally, pharmaceutical production has relied on time-consuming off-line analysis of production batches and finished products, which has led to poor understanding of processes and drug products. New, powerful methods that enable real-time monitoring of pharmaceuticals during manufacturing processes are therefore greatly needed. The aim of this thesis was to apply spectroscopic techniques to solid phase analysis within different stages of drug development and manufacturing, and thus provide molecular-level insight into the behavior of active pharmaceutical ingredients (APIs) during processing. Applications to polymorph screening and different unit operations were developed and studied. A new approach to dissolution testing, which involves simultaneous measurement of drug concentration in the dissolution medium and in-situ solid phase analysis of the dissolving sample, was introduced and studied. Solid phase analysis was successfully performed during the different stages, enabling molecular-level insight into the occurring phenomena. Near-infrared (NIR) spectroscopy was utilized in screening of polymorphs and processing-induced transformations (PITs). Polymorph screening was also studied with NIR and Raman spectroscopy in tandem.
Quantitative solid phase analysis during fluidized bed drying was performed with in-line NIR and Raman spectroscopy and partial least squares (PLS) regression, and different dehydration mechanisms were studied using in-situ spectroscopy and partial least squares discriminant analysis (PLS-DA). In-situ solid phase analysis with Raman spectroscopy during dissolution testing enabled analysis of dissolution as a whole, and provided a scientific explanation for changes in the dissolution rate. It was concluded that the methods applied and studied provide better process understanding and knowledge of the drug products, and therefore, a way to achieve better quality.
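The quantitative step pairing spectra with PLS regression can be sketched with a single-component PLS1 fit (one NIPALS step) on synthetic "spectra"; the data, band shape and noise level are illustrative assumptions, not the thesis measurements:

```python
import numpy as np

def pls1_one_component(X, y):
    """Single-component PLS1 (one NIPALS step): returns the coefficient
    vector b such that y_hat = (X - X.mean(axis=0)) @ b + y.mean()."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # weight vector (direction of covariance)
    t = Xc @ w                      # scores of the single latent variable
    q = (yc @ t) / (t @ t)          # y-loading
    return w * q

rng = np.random.default_rng(2)
# Synthetic spectra: concentration c times a pure-component band, plus noise.
c = rng.uniform(0, 1, 60)
band = np.exp(-0.5 * ((np.arange(50) - 25) / 5.0) ** 2)  # Gaussian "band"
X = np.outer(c, band) + 0.01 * rng.standard_normal((60, 50))

b = pls1_one_component(X, c)
pred = (X - X.mean(axis=0)) @ b + c.mean()
rmse = np.sqrt(np.mean((pred - c) ** 2))  # in-sample error of the calibration
print(rmse < 0.05)  # True: one latent variable captures the single band
```

Real calibrations use several latent variables with deflation and validate on held-out samples (giving the RMSEP-style figures quoted in these abstracts); the single-component case above is the smallest instance of the same idea.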
Abstract:
Better process understanding and new ideas are needed to develop traditional pharmaceutical powder manufacturing procedures. Process analytical technology (PAT) has been developed to improve understanding of the processes and to establish methods to monitor and control them. The interest is in maintaining, and even improving, the whole manufacturing process and the final products in real time. Process understanding can be a foundation for innovation and continuous improvement in pharmaceutical development and manufacturing. New methods are needed to increase the quality and safety of the final products faster and more efficiently than ever before. Real-time process monitoring demands tools that enable fast and noninvasive measurements with sufficient accuracy. Traditional quality control methods have been laborious and time-consuming, and they are performed off-line, i.e., the analysis is removed from the process area. Vibrational spectroscopic methods respond to this challenge, and their utilisation has increased considerably during the past few years. In addition, other methods such as colour analysis can be utilised for noninvasive real-time process monitoring. In this study, three pharmaceutical processes were investigated: drying, mixing and tabletting. In addition, tablet properties were evaluated. Real-time monitoring was performed with NIR and Raman spectroscopies, colour analysis and particle size analysis, and compression data collected during tabletting were evaluated using mathematical modelling. These methods were suitable for real-time monitoring of pharmaceutical unit operations and increased knowledge of the critical parameters in the processes and the phenomena occurring during the operations. They can improve process understanding and thereby, ultimately, enhance the quality of the final products.
Abstract:
The approach of generalized estimating equations (GEE) is based on the framework of generalized linear models but allows specification of a working matrix for modeling within-subject correlations. The variance is often assumed to be a known function of the mean. This article investigates the impact of misspecifying the variance function on estimators of the mean parameters for quantitative responses. Our numerical studies indicate that (1) correct specification of the variance function can improve estimation efficiency even if the correlation structure is misspecified; (2) misspecification of the variance function has a much greater impact on estimators of within-cluster covariates than of cluster-level covariates; and (3) if the variance function is misspecified, a correct choice of the correlation structure may not necessarily improve estimation efficiency. We illustrate the impact of different variance functions using a real data set on cow growth.
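The efficiency point in (1) can be illustrated outside the GEE machinery with a small weighted-least-squares simulation: when the variance truly is a known function of the mean (here Var ∝ mean²), weighting by the correct variance function yields a less variable slope estimate than ignoring it. This is a generic illustration under assumed parameters, not the article's analysis:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(1.0, 5.0, 50)
beta = 1.5

ols_est, wls_est = [], []
for _ in range(2000):
    mean = beta * x
    y = mean + 0.3 * mean * rng.standard_normal(50)  # Var(y) proportional to mean^2
    # Unweighted estimator ignores the variance function...
    ols_est.append((x @ y) / (x @ x))
    # ...while weighting by 1/mean^2 matches the true variance function.
    w = 1.0 / mean ** 2
    wls_est.append(((w * x) @ y) / ((w * x) @ x))

# Both estimators are unbiased, but the correctly weighted one is more efficient.
print(np.var(ols_est) > np.var(wls_est))  # True
```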
Abstract:
The Macroscopic Fundamental Diagram (MFD) relates space-mean density and flow. Since the MFD represents area-wide network traffic performance, studies on perimeter control strategies and network-wide traffic state estimation utilising the MFD concept have been reported. Most previous works have estimated the MFD from fixed sensors, such as inductive loops, which can cause biased estimation in urban networks due to queue spillovers at intersections. To overcome this limitation, recent literature reports the use of trajectory data obtained from probe vehicles. However, these studies have been conducted using simulated datasets; few works have discussed the limitations of real datasets and their impact on variable estimation. This study compares two methods for estimating the traffic state variables of signalised arterial sections: a method based on cumulative vehicle counts (CUPRITE), and one based on vehicle trajectories from taxi Global Positioning System (GPS) logs. The comparisons reveal some characteristics of the taxi trajectory data available in Brisbane, Australia. The current trajectory data are limited in quantity (i.e., the penetration rate), because of which the traffic state variables tend to be underestimated. Nevertheless, the trajectory-based method successfully captures the features of the traffic states, which suggests that taxi trajectories can be a good estimator of network-wide traffic states.
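Trajectory-based estimates of the space-mean variables typically rest on Edie's generalised definitions: over a section of length L observed for duration T, flow is total distance travelled divided by LT, and density is total time spent divided by LT. A minimal sketch with synthetic constant-speed probes (illustrative only; not the CUPRITE method or the Brisbane data):

```python
def edie_estimates(trajectories, L, T):
    """Edie's generalised definitions over a space-time region of size L x T.

    trajectories: list of (distance_travelled, time_spent) tuples, one per
    observed vehicle within the region.
    Returns (flow, density, space-mean speed).
    """
    total_distance = sum(d for d, _ in trajectories)
    total_time = sum(t for _, t in trajectories)
    q = total_distance / (L * T)   # flow [veh/s]
    k = total_time / (L * T)       # density [veh/m]
    return q, k, total_distance / total_time

# Three probes crossing a 500 m section during a 60 s window at 10 m/s each.
probes = [(500.0, 50.0), (500.0, 50.0), (500.0, 50.0)]
q, k, v = edie_estimates(probes, L=500.0, T=60.0)
print(q, k, v)  # 0.05 0.005 10.0

# With a low probe penetration rate only a fraction of vehicles contribute,
# so q and k are underestimated unless scaled by the (often unknown) rate --
# the limitation the abstract describes.
```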
Abstract:
The potential for large-scale use of a sensitive real-time reverse transcription polymerase chain reaction (RT-PCR) assay was evaluated for the detection of Tomato spotted wilt virus (TSWV) in single and bulked leaf samples by comparing its sensitivity with that of DAS-ELISA. Using total RNA extracted with RNeasy® or leaf-soak methods, real-time RT-PCR detected TSWV in all infected samples collected from 16 horticultural crop species (including flowers, herbs and vegetables), two arable crop species, and four weed species. In samples in which DAS-ELISA had previously detected TSWV, real-time RT-PCR was effective at detecting it in leaf tissues of all 22 plant species tested at a wide range of concentrations. Bulked samples required more robust and extensive extraction methods with real-time RT-PCR, but it generally detected one infected sample in 1000 uninfected ones. By contrast, ELISA was less sensitive when used to test bulked samples, at best detecting 1 infected in 800 samples with pepper and never more than 1 infected in 200 samples with tomato and lettuce. It was also less reliable than real-time RT-PCR when used to test samples from parts of the leaf where the virus concentration was low. The genetic variability among Australian isolates of TSWV was small. Direct sequencing of a 587 bp region of the nucleoprotein gene (S RNA) of 29 isolates from diverse crops and geographical locations yielded a maximum of only 4.3% nucleotide sequence difference. Phylogenetic analysis revealed no obvious grouping of isolates by geographic origin or host species. TSWV isolates that break TSWV resistance genes in tomato or pepper did not differ significantly in the N gene region studied, indicating that a different region of the virus genome is responsible for this trait.
Abstract:
Globalisation is set to have a major impact on the production and distribution of fruit and vegetables throughout the world. In contrast to developing countries such as China, production and consumption of fresh fruit and vegetables in most developed countries are relatively static. In developed countries, we are starting to see consolidation in the number of farms producing fruit and vegetables, with falling or static prices and real farm incomes. Global supply chains are now dominated by a few large multi-national retailers supplied by preferred trans-national distribution companies. The major competitive advantages that are emerging are consistency of supply of high-quality product over an extended season, and control of genetic resources and their marketing. To capture these new competitive advantages, new strategic analysis and planning processes must be implemented. In the past, strategic analysis and planning have been undertaken on an ad hoc basis without accurate global intelligence. In the future, working ‘on the supply chain’ will become equally important as, if not more important than, working ‘in the supply chain’. A revised approach to strategic planning, which encompasses and adjusts for the changes caused by globalisation, is urgently needed. A new six-step strategic analysis process is described.
Abstract:
Equid herpesvirus 1 (EHV1) causes a major disease of equids worldwide, resulting in considerable losses to the horse industry. A variety of techniques, including PCR, have been used to diagnose EHV1. Some of these PCRs were used in combination with other techniques, such as restriction enzyme analysis (REA) or hybridisation, making them cumbersome for routine diagnostic testing and increasing the chances of cross-contamination. Furthermore, they involve the use of suspected carcinogens such as ethidium bromide and ultraviolet light. In this paper, we describe a real-time PCR that uses minor groove-binding (MGB) probe technology for the diagnosis of EHV1. This technique requires no post-PCR manipulation, thereby reducing the risk of cross-contamination. Most importantly, the technique is specific: it was able to differentiate EHV1 from the closely related member of the Alphaherpesvirinae, equid herpesvirus 4 (EHV4). It was not reactive with common opportunistic pathogens such as Escherichia coli, Klebsiella oxytoca, Pseudomonas aeruginosa and Enterobacter agglomerans, which are often involved in abortion. Similarly, it did not react with equine pathogens such as Streptococcus equi, Streptococcus equisimilis, Streptococcus zooepidemicus, Taylorella equigenitalis and Rhodococcus equi, which also cause abortion. The results obtained with this technique agreed with results from published PCR methods. The assay was sensitive enough to detect EHV1 sequences in paraffin-embedded tissues and clinical samples, and it was more sensitive than virus isolation. This test will be useful for the routine diagnosis of EHV1 on account of its specificity, sensitivity, ease of performance and rapidity.
Abstract:
The traditional reductionist approach to science has a tendency to create 'islands of knowledge in a sea of ignorance', with a much stronger focus on analysis of scientific inputs than on synthesis of socially relevant outcomes. This might be the principal reason why intended end users of climate information generally fail to embrace what the climate science community has to offer. The translation of climate information into real-life action requires 3 essential components: salience (the perceived relevance of the information), credibility (the perceived technical quality of the information) and legitimacy (the perceived objectivity of the process by which the information is shared). We explore each of these components using 3 case studies focused on dryland cropping in Australia, India and Brazil. With regard to 'salience', we discuss the challenge for climate science to be 'policy-relevant', using Australian drought policy as an example. In a village in southern India, 'credibility' was gained through engagement between scientists and risk managers with the aim of building social capital, achieved only at high cost to science institutions. Finally, in Brazil we found that 'legitimacy' is a fragile yet renewable resource that needs to be part of the package for successful climate applications; legitimacy can be easily eroded but is difficult to recover. We conclude that climate risk management requires holistic solutions derived from cross-disciplinary and participatory, user-oriented research. Approaches that combine climate, agroecological and socioeconomic models provide the scientific capabilities for establishment of 'borderless' institutions without disciplinary constraints. Such institutions could provide the necessary support and flexibility to deliver the social benefits of climate science across diverse contexts.
Our case studies show that this type of solution is already being applied, and suggest that the climate science community attempt to address existing institutional constraints, which still impede climate risk management.
Abstract:
Background Psychotic-like experiences (PLEs) are subclinical delusional ideas and perceptual disturbances that have been associated with a range of adverse mental health outcomes. This study reports a qualitative and quantitative analysis of the acceptability, usability and short term outcomes of Get Real, a web program for PLEs in young people. Methods Participants were twelve respondents to an online survey, who reported at least one PLE in the previous 3 months, and were currently distressed. Ratings of the program were collected after participants trialled it for a month. Individual semi-structured interviews then elicited qualitative feedback, which was analyzed using Consensual Qualitative Research (CQR) methodology. PLEs and distress were reassessed at 3 months post-baseline. Results User ratings supported the program's acceptability, usability and perceived utility. Significant reductions in the number, frequency and severity of PLE-related distress were found at 3 months follow-up. The CQR analysis identified four qualitative domains: initial and current understandings of PLEs, responses to the program, and context of its use. Initial understanding involved emotional reactions, avoidance or minimization, limited coping skills and non-psychotic attributions. After using the program, participants saw PLEs as normal and common, had greater self-awareness and understanding of stress, and reported increased capacity to cope and accept experiences. Positive responses to the program focused on its normalization of PLEs, usefulness of its strategies, self-monitoring of mood, and information putting PLEs into perspective. Some respondents wanted more specific and individualized information, thought the program would be more useful for other audiences, or doubted its effectiveness. The program was mostly used in low-stress situations. Conclusions The current study provided initial support for the acceptability, utility and positive short-term outcomes of Get Real. 
The program now requires efficacy testing in randomized controlled trials.
Abstract:
The use of near-infrared (NIR) hyperspectral imaging and hyperspectral image analysis for distinguishing between hard, intermediate and soft maize kernels from inbred lines was evaluated. NIR hyperspectral images of two sets (12 and 24 kernels) of whole maize kernels were acquired using a Spectral Dimensions MatrixNIR camera with a spectral range of 960-1662 nm and a sisuChema SWIR (short wave infrared) hyperspectral pushbroom imaging system with a spectral range of 1000-2498 nm. Exploratory principal component analysis (PCA) was used on absorbance images to remove background, bad pixels and shading. On the cleaned images, PCA could be used effectively to find histological classes, including glassy (hard) and floury (soft) endosperm. PCA illustrated a distinct difference between glassy and floury endosperm, with two distinguishable clusters along principal component (PC) three on the MatrixNIR and PC two on the sisuChema. Subsequently, partial least squares discriminant analysis (PLS-DA) was applied to build a classification model. The PLS-DA model from the MatrixNIR image (12 kernels) resulted in a root mean square error of prediction (RMSEP) of 0.18. This was repeated on the MatrixNIR image of the 24 kernels, which also resulted in an RMSEP of 0.18. The sisuChema image yielded an RMSEP of 0.29. The reproducible results obtained with the different data sets indicate that the method proposed in this paper has real potential for future classification use.
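The exploratory PCA step can be sketched with a mean-centred SVD: two synthetic spectral classes that differ in the strength of one absorption band separate into distinct clusters along the first principal component (the data are synthetic, not the MatrixNIR or sisuChema images, and class separation here is placed on PC one rather than PC two or three for simplicity):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Scores of mean-centred X on its first principal components, via SVD."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(4)
band = np.exp(-0.5 * ((np.arange(100) - 60) / 8.0) ** 2)
# "Glassy" spectra carry the band strongly, "floury" spectra weakly.
glassy = 1.0 * band + 0.05 * rng.standard_normal((20, 100))
floury = 0.3 * band + 0.05 * rng.standard_normal((20, 100))
X = np.vstack([glassy, floury])

pc1 = pca_scores(X)[:, 0]
pc1 *= np.sign(pc1[:20].mean())          # fix the arbitrary sign of the PC
# The two endosperm classes form distinct, non-overlapping clusters along PC1.
print(pc1[:20].min() > pc1[20:].max())   # True
```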
Abstract:
BACKGROUND OR CONTEXT Thermodynamics is a core subject for mechanical engineers, yet it is notoriously difficult. Evidence suggests students struggle to understand and apply the fundamental concepts of thermodynamics, with analysis indicating a problem with student learning and engagement. A contributing factor is that thermodynamics is a ‘science involving concepts based on experiments’ (Mayhew 1990), with subject matter that cannot be completely defined a priori. To succeed, students must engage in a deep, holistic approach while taking ownership of their learning. The difficulty in achieving this often manifests itself in students ‘not getting’ the principles and declaring thermodynamics ‘hard’. PURPOSE OR GOAL Traditionally, students practice and ‘learn’ the application of thermodynamics in their tutorials; however, these do not consider prior conceptions (Holman & Pilling 2004). As ‘hands-on’ learning is the desired outcome of tutorials, it is pertinent to study methods of improving their efficacy. Within the Australian context, the format of thermodynamics tutorials has remained relatively unchanged over the decades, relying, anecdotally, on a primarily didactic pedagogical approach. Such approaches are not conducive to deep learning (Ramsden 2003), and students are often disengaged from the learning process. Evidence suggests (Haglund & Jeppsson 2012), however, that a deeper level and ownership of learning can be achieved using a more constructivist approach, for example through self-generated analogies. This pilot study aimed to collect data to support the hypothesis that the ‘difficulty’ of thermodynamics is associated with the pedagogical approach of tutorials rather than with actual difficulty of the subject content or deficiency in students. APPROACH Successful application of thermodynamic principles requires solid knowledge of the core concepts. Typically, tutorial sessions guide students in this application.
However, a lack of deep and comprehensive understanding can leave students confused in the applications, so that they learn the ‘process’ of application without understanding ‘why’. The aim of this study was to gain empirical data on student learning of both concepts and application within thermodynamics tutorials. The approach taken for data collection and analysis was: - 1 Four concurrent tutorial streams were timetabled to examine student engagement and learning in traditional ‘didactic’ (3 weeks) and non-traditional (3 weeks) formats. In each week, two of the selected four sessions were traditional and two non-traditional, providing a control group for each week. - 2 The non-traditional tutorials involved activities designed to promote student-centred deep learning. Specific pedagogies employed were: self-generated analogies, constructivist learning, peer-to-peer learning, inquiry-based learning, ownership of learning and active learning. - 3 After a three-week period, the teaching styles of the selected groups were switched, to allow each group to experience both approaches with the same tutor. This also acted to minimise any influence of tutor personality or style on the data. - 4 At the conclusion of the trial, participants completed a ‘5 minute essay’ on how they liked the sessions, a small questionnaire modelled on the modified (Christo & Hoang, 2013) SPQ designed by Biggs (1987), and a small formative quiz to gauge the level of learning achieved. DISCUSSION Preliminary results indicate that overall students respond positively to in-class demonstrations (inquiry-based learning) and active learning activities. Within the active learning exercises, the current data suggest students preferred individual rather than group or peer-to-peer activities.
Preliminary results from the open-ended questions, such as “What did you like most/least about this tutorial” and “Do you have other comments on how this tutorial could better facilitate your learning”, however, indicated polarising views on the non-traditional tutorials. Some students responded that they really liked the format and the emphasis on understanding the concepts, while others were very vocal that they ‘hated’ the style and just wanted the solutions to be presented by the tutor. RECOMMENDATIONS/IMPLICATIONS/CONCLUSION Preliminary results indicated a mixed, but overall positive, response by students to the more collaborative tutorials employing tasks promoting inquiry-based, peer-to-peer, active and ownership-of-learning activities. Preliminary results from student feedback support the evidence that students learn differently, and that running tutorials focusing on only one pedagogical approach (typically didactic) may not be beneficial to all students. Further, preliminary data suggest that the learning and teaching styles of both students and tutor are important in promoting deep learning in students. Data collection is still ongoing and scheduled for completion at the end of First Semester (Australian academic calendar). The final paper will examine in more detail the results and analysis of this project.