978 results for STIFFLY-STABLE METHODS


Relevance: 20.00%

Publisher:

Abstract:

Many industrial processes and systems can be modelled mathematically by a set of Partial Differential Equations (PDEs). Finding a solution to such a PDE model is essential for system design, simulation, and process control purposes. However, major difficulties appear when solving PDEs with singularity. Traditional numerical methods, such as finite difference, finite element, and polynomial based orthogonal collocation, not only have limitations in fully capturing the process dynamics but also demand enormous computation power due to the large number of elements or mesh points needed to accommodate sharp variations. To tackle this challenging problem, wavelet based approaches and high resolution methods have recently been developed, with successful applications to a fixed-bed adsorption column model. Our investigation has shown that recent advances in wavelet based approaches and high resolution methods have the potential to be adopted for solving more complicated dynamic system models. This chapter will highlight the successful applications of these new methods in solving complex models of simulated-moving-bed (SMB) chromatographic processes. An SMB process is a distributed parameter system and can be mathematically described by a set of partial/ordinary differential equations and algebraic equations. These equations are highly coupled, experience wave propagation with steep fronts, and require significant numerical effort to solve. To demonstrate the numerical computing power of the wavelet based approaches and high resolution methods, a single column chromatographic process modelled by a Transport-Dispersive-Equilibrium linear model is investigated first. Numerical solutions from the upwind-1 finite difference, wavelet-collocation, and high resolution methods are evaluated by quantitative comparisons with the analytical solution for a range of Peclet numbers.
After that, the advantages of the wavelet based approaches and high resolution methods are further demonstrated through applications to a dynamic SMB model for an enantiomer separation process. This research has revealed that for a PDE system with a low Peclet number, all existing numerical methods work well, but the upwind finite difference method consumes the most time for the same degree of accuracy of the numerical solution. The high resolution method provides an accurate numerical solution for a PDE system with a medium Peclet number. The wavelet collocation method is capable of capturing steep changes in the solution, and thus can be used for solving PDE models with high singularity. For the complex SMB system models under consideration, both the wavelet based approaches and high resolution methods are good candidates in terms of computation demand and prediction accuracy on the steep front. The high resolution methods have shown better stability in achieving steady state in the specific case studied in this chapter.
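As a pointer to what the baseline scheme involves, here is a minimal sketch of an explicit upwind-1 finite difference solver for a transport-dispersive column model dc/dt + u dc/dz = D d²c/dz² with a step injection at the inlet. All parameter values (and the single Peclet number shown) are illustrative, not taken from the chapter:

```python
import numpy as np

# Upwind-1 finite difference for dc/dt + u dc/dz = D d2c/dz2.
L, u, Pe = 1.0, 1.0, 100.0          # column length, velocity, Peclet number
D = u * L / Pe                      # dispersion coefficient implied by Pe
nz = 200
dz = L / (nz - 1)
dt = 0.4 * min(dz / u, dz**2 / (2 * D))   # CFL-limited explicit time step
c = np.zeros(nz)
t = 0.0
while t < 0.5:                      # integrate to dimensionless t = 0.5
    c[0] = 1.0                      # step injection at the inlet
    adv = -u * (c[1:-1] - c[:-2]) / dz                   # first-order upwind
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dz**2     # central diffusion
    c[1:-1] += dt * (adv + dif)
    c[-1] = c[-2]                   # zero-gradient outlet condition
    t += dt
print(c[nz // 2])                   # concentration near the front centre
```

The first-order upwind term is what introduces the numerical diffusion (of order u·dz/2) that the chapter contrasts with wavelet-collocation and high resolution schemes.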

Abstract:

The processes of digitisation and deregulation have transformed the production, distribution and consumption of information and entertainment media over the past three decades. Today, researchers are confronted with profoundly different landscapes of domestic and personal media than the pioneers of qualitative audience research that came to form much of the conceptual basis of Cultural Studies, first in Britain and North America and subsequently across all global regions. The process of media convergence, as a consequence of the dual forces of digitisation and deregulation, thus constitutes a central concept in the analysis of popular mass media. From the study of the internationalisation and globalisation of media content and changing regimes of media production, via the social shaping of communication technologies and, conversely, the impact of communication technology on social, cultural and political realities, to the emergence of transmedia storytelling, the interplay of intertextuality and genre and the formation of mediated social networks, convergence informs and shapes contemporary conceptual debates in the field of popular communication and beyond. However, media convergence challenges not only the conceptual canon of (popular) communication research; it also poses profound methodological challenges. As boundaries between producers and consumers become increasingly fluid, formerly stable fields and categories of research such as industries, texts and audiences intersect and overlap, requiring combined and new research strategies. This preconference aims to offer a forum to present and discuss methodological innovations in the study of contemporary media and the analysis of the social, cultural, and political impact and challenges arising through media convergence.
The preconference thus aims to focus on the following methodological questions and challenges:

* New strategies of audience research responding to the increasing individualisation of popular media consumption.
* Methods of data triangulation in and through the integrated study of media production, distribution and consumption.
* Bridging the methodological and often associated conceptual gap between qualitative and quantitative research in the study of popular media.
* The future of ethnographic audience and production research in light of blurring boundaries between media producers and consumers.
* A critical re-examination of which textual configurations can be meaningfully described and studied as text.
* Methodological innovations aimed at assessing the macro social, cultural and political impact of mediatization (including, but not limited to, "creative methods").
* Methodological responses to the globalisation of popular media and practicalities of international and transnational comparative research.
* An exploration of new methods required in the study of media flow and intertextuality.

Abstract:

The problem of bubble contraction in a Hele-Shaw cell is studied for the case in which the surrounding fluid is of power-law type. A small perturbation of the radially symmetric problem is first considered, focussing on the behaviour just before the bubble vanishes, it being found that for shear-thinning fluids the radially symmetric solution is stable, while for shear-thickening fluids the aspect ratio of the bubble boundary increases. The borderline (Newtonian) case considered previously is neutrally stable, the bubble boundary becoming elliptic in shape with the eccentricity of the ellipse depending on the initial data. Further light is shed on the bubble contraction problem by considering a long thin Hele-Shaw cell: for early times the leading-order behaviour is one-dimensional in this limit; however, as the bubble contracts its evolution is ultimately determined by the solution of a Wiener-Hopf problem, the transition between the long-thin limit and the extinction limit in which the bubble vanishes being described by what is in effect a similarity solution of the second kind. This same solution describes the generic (slit-like) extinction behaviour for shear-thickening fluids, the interface profiles that generalise the ellipses that characterise the Newtonian case being constructed by the Wiener-Hopf calculation.
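The abstract presumes the standard gap-averaged reduction for a power-law fluid in a Hele-Shaw cell; as a sketch (symbols and the lumped constant are assumptions, not taken from the paper), a fluid with shear-rate exponent n obeys a nonlinear Darcy law, and incompressibility then gives a p-Laplacian problem for the pressure:

```latex
% Gap-averaged flow of a power-law fluid (exponent n) in a Hele-Shaw cell;
% C lumps the gap width and the consistency index:
\mathbf{q} \;=\; -\,C\,|\nabla p|^{\frac{1}{n}-1}\,\nabla p ,
\qquad
\nabla\cdot\mathbf{q} = 0
\;\;\Longrightarrow\;\;
\nabla\cdot\!\left(|\nabla p|^{\frac{1}{n}-1}\,\nabla p\right) = 0 .
```

Here n < 1 corresponds to the shear-thinning case (stable radial contraction), n > 1 to shear-thickening, and n = 1 recovers the Newtonian Laplace problem considered previously.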

Abstract:

The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices are storing data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify these slightly for higher density storage. Alternatively, three dimensional optical data storage is a promising field for the information storage needs of the future, offering very high density, high speed memory. There are two ways in which data may be recorded in a three dimensional optical medium: either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates, due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but with few commercial products presently available.
Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two dimensional pages of data are recorded into a photorefractive crystal as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal lithium niobate (LiNbO3). Firstly, the experimental methods for storing the two dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask which contains a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing. Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. A method to avoid this is thermal fixing. Here the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium.
This ionic grating is insensitive to the readout beam and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to a situation where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered.
The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the time at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could be applied to an application in image scrambling or cryptography for optical information storage. A two dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest quality pattern storage would be achieved with a thin 0.5 mm medium; however, this type of medium would also remove the degradation property of the patterns and the subsequent recovery process. To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three dimensional photorefractive model developed by Devaux is presented. This model gives significant insight into the pattern storage, particularly for the degradation and recovery process, and confirms the theory that the recovery of the degraded patterns is possible because the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes, compared with the stripes of smaller widths.
As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm for accurate and reliable pattern storage.
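The FD-BPM mentioned above can be illustrated with a minimal one-transverse-dimension sketch: a Gaussian beam propagates through a medium carrying a weak stripe-like refractive index change, advanced in z by Crank-Nicolson steps of the paraxial equation. All parameter values (wavelength, grid, index contrast) are illustrative, not the thesis parameters:

```python
import numpy as np

# Paraxial FD-BPM sketch: dE/dz = i*H*E with H real symmetric, advanced by
# the Cayley (Crank-Nicolson) form, which conserves power exactly.
wl = 633e-9                       # wavelength [m]
n0 = 2.2                          # background index (order of LiNbO3)
k = 2 * np.pi * n0 / wl
nx, dx, dz = 256, 0.25e-6, 1.0e-6
x = (np.arange(nx) - nx // 2) * dx
E = np.exp(-(x / 5e-6) ** 2).astype(complex)      # input Gaussian field
dn = 1e-4 * (np.cos(2 * np.pi * x / 10e-6) > 0)   # stripe-like index change

D2 = (np.diag(-2.0 * np.ones(nx)) + np.diag(np.ones(nx - 1), 1)
      + np.diag(np.ones(nx - 1), -1)) / dx**2     # second-difference matrix
H = D2 / (2 * k) + k * np.diag(dn / n0)
A = np.eye(nx) - 1j * (dz / 2) * H                # (I - i dz/2 H) E_new =
B = np.eye(nx) + 1j * (dz / 2) * H                # (I + i dz/2 H) E_old
P0 = np.sum(np.abs(E) ** 2)
for _ in range(200):                              # propagate 0.2 mm
    E = np.linalg.solve(A, B @ E)
ratio = np.sum(np.abs(E) ** 2) / P0
print(ratio)                                      # power conservation, ~1.0
```

The thesis's two-dimensional model additionally couples the propagated intensity back into the photorefractive index change; this sketch only shows the beam-propagation half of that loop.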

Abstract:

The psychological contract has emerged over the past 60 years as a key analytical device for both academics and practitioners to conceptualise and explain the employment relationship. However, despite the recognised importance of this field, some authors suggest it has fallen into a ‘methodological rut’ and is neglecting to empirically assess basic theoretical tenets of the concept – such as the temporal and individualised, subjective nature of the construct. This paper describes the research design of a longitudinal, mixed methods study to explore development and change in the psychological contract, and outlines how the use of individual growth modelling can be a powerful tool in analysing the type of quantitative data collected. Finally, by briefly outlining the benefits of this approach, the paper seeks to offer an alternative methodology to explore the dynamic and intra-individual processes within the psychological contract domain.
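The growth-modelling idea can be sketched on synthetic data. A full analysis would fit a mixed-effects (multilevel) growth model; the two-stage approximation below, which fits each respondent's trajectory and then summarises the person-level growth parameters, conveys the same logic on clean, balanced data (all numbers are invented):

```python
import numpy as np

# Two-stage sketch of individual growth modelling on synthetic
# psychological-contract fulfilment scores measured at four waves.
rng = np.random.default_rng(0)
n_people, n_waves = 100, 4
t = np.arange(n_waves)                        # measurement occasions
intercepts = rng.normal(5.0, 1.0, n_people)   # person-specific baselines
slopes = rng.normal(-0.2, 0.3, n_people)      # person-specific change rates
y = (intercepts[:, None] + slopes[:, None] * t
     + rng.normal(0.0, 0.5, (n_people, n_waves)))   # observed scores

fit = np.polyfit(t, y.T, 1)      # row 0: per-person slopes, row 1: intercepts
mean_slope, slope_sd = fit[0].mean(), fit[0].std()
print(mean_slope, slope_sd)      # average change and its between-person spread
```

The between-person spread in slopes is exactly the intra-individual heterogeneity that the paper argues conventional cross-sectional designs cannot capture.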

Abstract:

Seven endemic governance problems are shown to be currently present in governments around the globe and at any level of government as well (for example municipal, federal). These problems have their roots traced back through more than two thousand years of political, specifically ‘democratic’, history. The evidence shows that accountability, transparency, corruption, representation, campaigning methods, constitutionalism and long-term goals were problematic for the ancient Athenians as well as modern international democratisation efforts encompassing every major global region. Why then, given the extended time period humans have had to deal with these problems, are they still present? At least part of the answer to this question is that philosophers, academics and NGOs as well as MNOs have only approached these endemic problems in a piecemeal manner with a skewed perspective on democracy. Their works have also been subject to the ebbs and flows of human history which essentially started and stopped periods of thinking. In order to approach the investigation of endemic problems in relation to democracy (as the overall quest of this thesis was to generate prescriptive results for the improvement of democratic government), it was necessary to delineate what exactly is being written about when using the term ‘democracy’. It is common knowledge that democracy has no one specific definition or practice, even though scholars and philosophers have been attempting to create a definition for generations. What is currently evident, is that scholars are not approaching democracy in an overly simplified manner (that is, it is government for the people, by the people) but, rather, are seeking the commonalities that democracies share, in other words, those items which are common to all things democratic. Following that specific line of investigation, the major practiced and theoretical versions of democracy were thematically analysed. 
After that, their themes were collapsed into larger categories, at which point the larger categories were comparatively analysed with the practiced and theoretical versions of democracy. Four democratic ‘particles’ (selecting officials, law, equality and communication) were seen to be present in all practiced and theoretical democratic styles. The democratic particles, fused with a unique investigative perspective and in-depth political study, created a solid conceptualisation of democracy. As such, it is argued that democracy is an ever-present element of any state government, ‘democratic’ or not, and the particles are the bodies which comprise the democratic element. Frequency- and proximity-based analyses showed that democratic particles are related to endemic problems in international democratisation discourse. The linkages between democratic particles and endemic problems were also evident during the thematic analysis as well as the historical review. This ultimately led to the viewpoint that mitigating endemic problems may improve democratic particles, which might strengthen the element of democracy in the governing apparatus of any state. Such strengthening may actively minimise or wholly displace inefficient forms of government, leading to a government specifically tailored to the population it orders. Once the theoretical and empirical goals were attained, this thesis provided some prescriptive measures which government, civil society, academics, professionals and/or active citizens can use to mitigate endemic problems (in any country and at any level of government) so as to improve the human condition via better democratic government.

Abstract:

Objective Uterine Papillary Serous Carcinoma (UPSC) is uncommon and accounts for less than 5% of all uterine cancers. Therefore the majority of evidence about the benefits of adjuvant treatment comes from retrospective case series. We conducted a prospective multi-centre non-randomized phase 2 clinical trial using four cycles of adjuvant paclitaxel plus carboplatin chemotherapy followed by pelvic radiotherapy, in order to evaluate the tolerability and safety of this approach. Methods This trial enrolled newly diagnosed, previously untreated patients with stage 1b–4 (FIGO-1988) UPSC with a papillary serous component of at least 30%. Paclitaxel (175 mg/m2) and carboplatin (AUC 6) were administered on day 1 of each 3-week cycle for 4 cycles. Chemotherapy was followed by external beam radiotherapy to the whole pelvis (50.4 Gy over 5.5 weeks). Completion and toxicity of treatment (Common Toxicity Criteria, CTC) and quality of life measures were the primary outcome indicators. Results Twenty-nine of 31 patients completed treatment as planned. Dose reduction was needed in 9 patients (29%), treatment delay in 7 (23%), and treatment cessation in 2 patients (6.5%). Hematologic toxicity of grade 3 or 4 occurred in 19% (6/31) of patients. Patients' self-reported quality of life remained stable throughout treatment. Thirteen of the 29 patients with stages 1–3 disease (44.8%) recurred (average follow up 28.1 months, range 8–60 months). Conclusion This multimodal treatment is feasible, safe and tolerated reasonably well, and would be suitable for use in multi-institutional prospective randomized clinical trials incorporating novel therapies in patients with UPSC.

Abstract:

Tungro is one of the most destructive viral diseases of rice in South and Southeast Asia. It is associated with two viruses: rice tungro bacilliform virus (RTBV) and rice tungro spherical virus (RTSV) (Hibino et al 1978). Both viruses are transmitted by the green leafhopper (GLH) Nephotettix virescens (Ling 1979). However, prior acquisition of RTSV is required for the transmission of RTBV alone (Hibino 1983). Plants infected with both viruses show severe stunting and yellowing. Those infected with RTBV alone show mild stunting but no leaf discoloration, whereas those infected with RTSV alone do not show any apparent symptoms (Hibino et al 1978). Since the late 1960s, tungro has been mainly managed through varietal resistance (Khush 1989). The instability of resistant varieties in the field (Dahal et al 1990) led to a reexamination of the nature of the incorporated sources of resistance and to the adoption of more precise and more accurate screening methods.

Abstract:

In this paper, an enriched radial point interpolation method (e-RPIM) is developed for the determination of crack tip fields. In e-RPIM, the conventional RBF interpolation is augmented with suitable trigonometric basis functions to reflect the properties of stresses for the crack tip fields. The performance of the enriched RBF meshfree shape functions is first investigated by fitting different surfaces. The surface fitting results have proven that, compared with the conventional RBF shape function, the enriched RBF shape function has: (1) a similar accuracy in fitting a polynomial surface; (2) a much better accuracy in fitting a trigonometric surface; and (3) a similar interpolation stability without an increase in the condition number of the RBF interpolation matrix. It has therefore been proven that the enriched RBF shape function not only possesses all the advantages of the conventional RBF shape function, but can also accurately reflect the properties of stresses for the crack tip fields. The system of equations for the crack analysis is then derived based on the enriched RBF meshfree shape function and the meshfree weak-form. Several problems of linear fracture mechanics are simulated using this newly developed e-RPIM method. It is demonstrated that the present e-RPIM is very accurate and stable, and has good potential to develop into a practical simulation tool for fracture mechanics problems.
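The enrichment idea can be sketched in one dimension (the paper's method is two-dimensional with crack-tip trigonometric terms; the nodes, multiquadric shape parameter and target function below are illustrative): a conventional multiquadric RBF interpolant is compared with one augmented by trigonometric basis columns through the usual constrained saddle-point system.

```python
import numpy as np

def rbf_fit(xs, fs, extra_basis=()):
    # Multiquadric RBF interpolation, optionally augmented with extra basis
    # columns P via the constrained system [[Phi, P], [P.T, 0]].
    c = 1.0                                   # multiquadric shape parameter
    Phi = np.sqrt((xs[:, None] - xs[None, :]) ** 2 + c ** 2)
    if extra_basis:
        P = np.column_stack([b(xs) for b in extra_basis])
        n, m = len(xs), P.shape[1]
        A = np.zeros((n + m, n + m))
        A[:n, :n], A[:n, n:], A[n:, :n] = Phi, P, P.T
        coef = np.linalg.solve(A, np.concatenate([fs, np.zeros(m)]))
        return lambda x: (np.sqrt((x[:, None] - xs[None, :]) ** 2 + c ** 2)
                          @ coef[:n]
                          + np.column_stack([b(x) for b in extra_basis])
                          @ coef[n:])
    coef = np.linalg.solve(Phi, fs)
    return lambda x: np.sqrt((x[:, None] - xs[None, :]) ** 2 + c ** 2) @ coef

f = lambda x: np.sin(3 * x)                   # trigonometric target "surface"
xs = np.linspace(0, 2 * np.pi, 12)            # sparse interpolation nodes
xt = np.linspace(0, 2 * np.pi, 200)           # dense evaluation points
plain = rbf_fit(xs, f(xs))
enriched = rbf_fit(xs, f(xs), extra_basis=(lambda x: np.sin(3 * x),
                                           lambda x: np.cos(3 * x)))
err_plain = np.max(np.abs(plain(xt) - f(xt)))
err_enr = np.max(np.abs(enriched(xt) - f(xt)))
print(err_plain, err_enr)
```

When the target lies in the span of the enrichment columns, the augmented system can reproduce it essentially exactly, which mirrors the paper's finding (2) on trigonometric surfaces.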

Abstract:

The aim of this paper is to aid researchers in selecting appropriate qualitative methods in order to develop and improve future studies in the field of emotional design. These methods include observations, think-aloud protocols, questionnaires, diaries and interviews. Based on the authors’ experiences, it is proposed that the methods under review can be successfully used for collecting data on emotional responses to evaluate user–product relationships. This paper reviews these methods and discusses their suitability, advantages and challenges in relation to design and emotion studies. Furthermore, the paper outlines the potential impact of technology on the application of these methods, discusses the implications of these methods for emotion research and concludes with recommendations for future work in this area.

Abstract:

Background: Assessments of change in subjective patient-reported outcomes such as health-related quality of life (HRQoL) are a key component of many clinical and research evaluations. However, conventional longitudinal evaluation of change may not agree with patient-perceived change if patients' understanding of the subjective construct under evaluation changes over time (response shift) or if patients have inaccurate recollection (recall bias). This study examined whether older adults' perception of change is in agreement with conventional longitudinal evaluation of change in their HRQoL over the duration of their hospital stay. It also investigated this level of agreement after adjusting patient-perceived change for recall bias that patients may have experienced. Methods: A prospective longitudinal cohort design nested within a larger randomised controlled trial was implemented. 103 hospitalised older adults participated in this investigation at a tertiary hospital facility. The EQ-5D utility and Visual Analogue Scale (VAS) scores were used to evaluate HRQoL. Participants completed EQ-5D reports as soon as they were medically stable (within three days of admission), then again immediately prior to discharge. Three methods of change score calculation were used (conventional change, patient-perceived change, and patient-perceived change adjusted for recall bias). Agreement was primarily investigated using intraclass correlation coefficients (ICC) and limits of agreement. Results: Overall, 101 (98%) participants completed both admission and discharge assessments. The mean (SD) age was 73.3 (11.2) years. The median (IQR) length of stay was 38 (20-60) days. For agreement between conventional longitudinal change and patient-perceived change, ICCs were 0.34 and 0.40 for EQ-5D utility and VAS respectively. For agreement between conventional longitudinal change and patient-perceived change adjusted for recall bias, ICCs were 0.98 and 0.90 respectively.
Discrepancy between conventional longitudinal change and patient-perceived change was considered clinically meaningful for 84 (83.2%) of participants; after adjusting for recall bias, this reduced to 8 (7.9%). Conclusions: Agreement between conventional change and patient-perceived change was not strong. A large proportion of this disagreement could be attributed to recall bias. To overcome the invalidating effects of response shift (on conventional change) and recall bias (on patient-perceived change), a method of adjusting patient-perceived change for recall bias has been described.
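The agreement comparison reported above can be illustrated on synthetic change scores. The ICC below is the standard two-way random-effects, absolute-agreement, single-measure form (Shrout and Fleiss); all data values are invented and chosen only to reproduce the qualitative pattern of low raw agreement and high adjusted agreement:

```python
import numpy as np

def icc_2_1(scores):
    # ICC(2,1): two-way random effects, absolute agreement, single measures.
    # scores: (n subjects) x (k methods) array.
    n, k = scores.shape
    grand = scores.mean()
    ms_rows = k * np.sum((scores.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((scores.mean(axis=0) - grand) ** 2) / (k - 1)
    ss_err = (np.sum((scores - grand) ** 2)
              - (n - 1) * ms_rows - (k - 1) * ms_cols)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k * (ms_cols - ms_err) / n)

rng = np.random.default_rng(1)
conventional = rng.normal(0.1, 0.3, 100)                   # discharge - admission
perceived = conventional + rng.normal(0.0, 0.4, 100)       # recall-bias distorted
perceived_adj = conventional + rng.normal(0.0, 0.02, 100)  # bias-adjusted
icc_raw = icc_2_1(np.column_stack([conventional, perceived]))
icc_adj = icc_2_1(np.column_stack([conventional, perceived_adj]))
print(icc_raw, icc_adj)
```

With the distortion removed, the ICC moves from moderate toward near-perfect agreement, the same direction of shift the study reports (0.34–0.40 versus 0.98 and 0.90).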

Abstract:

OBJECTIVES: To compare three different methods of falls reporting and examine the characteristics of the data missing from the hospital incident reporting system. DESIGN: Fourteen-month prospective observational study nested within a randomized controlled trial. SETTING: Rehabilitation, stroke, medical, surgical, and orthopedic wards in Perth and Brisbane, Australia. PARTICIPANTS: Fallers (n = 153) who were part of a larger trial (1,206 participants, mean age 75.1 ± 11.0). MEASUREMENTS: Three falls event reporting measures: participants' self-report of fall events, fall events reported in participants' case notes, and fall events reported through the hospital reporting systems. RESULTS: The three reporting systems identified 245 falls events in total. Participants' case notes captured 226 (92.2%) falls events, hospital incident reporting systems captured 185 (75.5%) falls events, and participant self-report captured 147 (60.2%) falls events. Falls events were significantly less likely to be recorded in hospital reporting systems when a participant sustained a subsequent fall (P = .01) or when the fall occurred in the morning shift (P = .01) or afternoon shift (P = .01). CONCLUSION: Falls data missing from hospital incident report systems are not missing completely at random and will therefore introduce bias in some analyses if the factor investigated is related to whether the data is missing. Multimodal approaches to collecting falls data are preferable to relying on a single source alone.
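The "not missing completely at random" conclusion rests on testing missingness against event characteristics. A hand-rolled 2×2 chi-square sketch of that kind of check (the counts below are invented for illustration, not the study's data):

```python
import numpy as np

# Is a fall being missing from the incident system associated with the
# shift it occurred on? 2x2 chi-square test of independence (invented counts).
#                 recorded  missing
table = np.array([[70, 30],    # morning/afternoon shifts
                  [115, 30]])  # other times
row = table.sum(axis=1, keepdims=True)
col = table.sum(axis=0, keepdims=True)
expected = row @ col / table.sum()          # expected counts under MCAR
chi2 = np.sum((table - expected) ** 2 / expected)
print(chi2)   # compare against the chi-square(1) critical value 3.84 at alpha=.05
```

A statistic exceeding the critical value would indicate that missingness depends on shift, i.e. the data are not missing completely at random.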

Abstract:

This paper is concerned with some plane strain and axially symmetric free surface problems which arise in the study of static granular solids that satisfy the Coulomb-Mohr yield condition. Such problems are inherently nonlinear, and hence difficult to attack analytically. Given that a Coulomb friction condition holds on a solid boundary, it is shown that the angle at which a free surface is allowed to attach to the boundary depends only on the angle of wall friction, assuming the stresses are all continuous at the attachment point, and assuming also that the coefficient of cohesion is nonzero. As a model problem, the formation of stable cohesive arches in hoppers is considered. This undesirable phenomenon is an obstacle to flow, and occurs when the hopper outlet is too small. Typically, engineers are concerned with predicting the critical outlet size for a given hopper and granular solid, so that for hoppers with outlets larger than this critical value, arching cannot occur. This is a topic of considerable practical interest, with most accepted engineering methods being conservative in nature. Here, the governing equations in two limiting cases (small cohesion and high angle of internal friction) are considered directly. No information on the critical outlet size is found; however, solutions for the shape of the free boundary (the arch) are presented, for both plane and axially symmetric geometries.
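For reference, the yield and wall-friction conditions invoked above take the standard forms (notation assumed here, not taken from the paper):

```latex
% Coulomb-Mohr yield condition, with cohesion c, angle of internal
% friction phi, and compressive normal stress sigma taken positive:
|\tau| \;=\; c \;+\; \sigma \tan\phi ,
% and the Coulomb wall-friction condition on a solid boundary, with
% wall-friction angle phi_w and wall-normal stress sigma_n:
|\tau_w| \;=\; \sigma_n \tan\phi_w .
```

The attachment-angle result quoted in the abstract is the statement that, under stress continuity and nonzero c, only the wall-friction angle phi_w enters the admissible angle between the free surface and the boundary.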

Abstract:

Background: International data on child maltreatment are largely derived from child protection agencies, and predominantly report only substantiated cases of child maltreatment. This approach underestimates the incidence of maltreatment and makes inter-jurisdictional comparisons difficult. There has been a growing recognition of the importance of health professionals in identifying, documenting and reporting suspected child maltreatment. This study aimed to describe the issues around case identification using coded morbidity data, outline methods for selecting and grouping relevant codes, and illustrate patterns of maltreatment identified. Methods: A comprehensive review of the ICD-10-AM classification system was undertaken, including review of index terms, a free text search of tabular volumes, and a review of coding standards pertaining to child maltreatment coding. Identified codes were further categorised into maltreatment types including physical abuse, sexual abuse, emotional or psychological abuse, and neglect. Using these code groupings, one year of Australian hospitalisation data for children under 18 years of age was examined to quantify the proportion of patients identified and to explore the characteristics of cases assigned maltreatment-related codes. Results: Less than 0.5% of all children hospitalised in Australia between 2005 and 2006 had a maltreatment code assigned; however, almost 4% of children with a principal diagnosis of a mental and behavioural disorder, and over 1% of children with an injury or poisoning as the principal diagnosis, had a maltreatment code assigned. The patterns of children assigned definitive T74 codes varied by sex and age group. For males selected as having a maltreatment-related presentation, physical abuse was most commonly coded (62.6% of maltreatment cases), while for females selected as having a maltreatment-related presentation, sexual abuse was the most commonly assigned form of maltreatment (52.9% of maltreatment cases).
Conclusion: This study has demonstrated that hospital data could provide valuable information for routine monitoring and surveillance of child maltreatment, even in the absence of population-based linked data sources. With national and international calls for a public health response to child maltreatment, better understanding of, investment in and utilisation of our core national routinely collected data sources will enhance the evidence-base needed to support an appropriate response to children at risk.
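The code-grouping step described in the Methods can be sketched as follows. The T74 subdivisions follow the ICD-10 convention (T74.0 neglect, T74.1 physical abuse, T74.2 sexual abuse, T74.3 psychological abuse); the records and their field names are invented for illustration:

```python
from collections import Counter

# Map maltreatment-related ICD-10 codes to maltreatment types (ICD-10 T74
# subdivisions; a full study would also include the extended code list
# identified through index terms and free-text search).
MALTREATMENT_TYPES = {
    "T74.0": "neglect",
    "T74.1": "physical abuse",
    "T74.2": "sexual abuse",
    "T74.3": "psychological abuse",
}

# Synthetic hospitalisation records: each carries all assigned codes.
records = [
    {"id": 1, "sex": "M", "codes": ["S06.0", "T74.1"]},
    {"id": 2, "sex": "F", "codes": ["F43.1", "T74.2"]},
    {"id": 3, "sex": "M", "codes": ["J18.9"]},            # no maltreatment code
    {"id": 4, "sex": "F", "codes": ["T74.2", "T74.0"]},
]

# Tally maltreatment types across all assigned codes.
tally = Counter(
    MALTREATMENT_TYPES[code]
    for record in records
    for code in record["codes"]
    if code in MALTREATMENT_TYPES
)
print(tally)
```

Grouping by `sex` as well would reproduce the kind of male/female pattern breakdown reported in the Results.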