739 results for page layout analysis
Abstract:
Pediatric oncology has emerged as one of the great medical success stories of the last 4 decades. The cure rate of childhood cancer has increased from approximately 25% in the 1960s to more than 75% in recent years. However, very little is known about how children actually experience the diagnosis and treatment of their illness. A total of 9 families in which a child was diagnosed with cancer were interviewed twice over a 12-month period. Using the qualitative methodology of interpretative phenomenological analysis (IPA), children's experiences of being patients with a diagnosis of cancer were explicated. The results revealed 5 significant themes: the experience of illness, the upside of being sick, refocusing on what is important, acquiring a new perspective, and the experience of returning to wellbeing. Changes over time were noted because children's experiences were often pertinent to the stage of treatment the child had reached. These results revealed rich and intimate information about a sensitive issue, with implications for understanding child development and medical and psychosocial treatment.
Abstract:
Details are presented of a project which fictionalises the oral history of the life of the author's polio-afflicted grandmother, Beth Bevan, and her experiences at a home for children with disabilities. The speech and language patterns recognised in the first-person narration are described, as is the sense of voice and identity communicated through the oral history.
Abstract:
User-Web interactions have emerged as an important area of research in the field of information science. In this study, we investigate the effects of users' cognitive styles on their Web navigational styles and information processing strategies. We report results from the analyses of 594 minutes of recorded Web search sessions of 18 participants engaged in 54 scenario-based search tasks. We use questionnaires, a cognitive style test, Web session logs and think-aloud protocols as the data collection instruments. We classify users' cognitive styles as verbalisers and imagers based on Riding's (1991) Cognitive Style Analysis test. Two classifications of navigational styles and three categories of information processing strategies are identified. Our findings show that relationships exist between users' cognitive styles and their navigational styles and information processing strategies. Verbal users seem to display sporadic navigational styles and adopt a scanning strategy to understand the content of the search result page, while imagery users follow a structured navigational style and a reading approach. We develop a matrix and a model that depict the relationships between users' cognitive styles and their navigational styles and information processing strategies. We discuss how the findings from this study could help search engine designers provide adaptive navigation support to users.
Abstract:
Myosin is believed to act as the molecular motor for many actin-based motility processes in eukaryotes. It is becoming apparent that a single species may possess multiple myosin isoforms, and at least seven distinct classes of myosin have been identified from studies of animals, fungi, and protozoans. The complexity of the myosin heavy-chain gene family in higher plants was investigated by isolating and characterizing myosin genomic and cDNA clones from Arabidopsis thaliana. Six myosin-like genes were identified from three polymerase chain reaction (PCR) products (PCR1, PCR11, PCR43) and three cDNA clones (ATM2, MYA2, MYA3). Sequence comparisons of the deduced head domains suggest that these myosins are members of two major classes. Analysis of the overall structure of the ATM2 and MYA2 myosins shows that they are similar to the previously-identified ATM1 and MYA1 myosins, respectively. The MYA3 appears to possess a novel tail domain, with five IQ repeats, a six-member imperfect repeat, and a segment of unique sequence. Northern blot analyses indicate that some of the Arabidopsis myosin genes are preferentially expressed in different plant organs. Combined with previous studies, these results show that the Arabidopsis genome contains at least eight myosin-like genes representing two distinct classes.
Abstract:
Purpose - The purpose of this paper is to introduce a knowledge-based urban development assessment framework, constructed to evaluate and assist in the (re)formulation of the local and regional policy frameworks and applications necessary in knowledge city transformations. Design/methodology/approach - The research reported in this paper follows a methodological approach that includes a thorough review of the literature, development of an assessment framework to inform policy-making by accurately evaluating the knowledge-based development levels of cities, and application of this framework in a comparative study of Boston, Vancouver, Melbourne and Manchester. Originality/value - The paper, with its assessment framework, demonstrates an innovative way of examining the knowledge-based development capacity of cities by scrutinising their economic, socio-cultural, enviro-urban and institutional development mechanisms and capabilities. Practical implications - The paper introduces a framework developed to assess the knowledge-based development levels of cities; presents some of the generic indicators used to evaluate the knowledge-based development performance of cities; demonstrates how a city can benchmark its development level against that of other cities; and provides insights for achieving more sustainable and knowledge-based development.
Abstract:
The theory of nonlinear dynamic systems provides new methods for handling complex systems. Chaos theory offers new concepts, algorithms and methods for processing, enhancing and analyzing measured signals. In recent years, researchers have been applying concepts from this theory to bio-signal analysis. In this work, the complex dynamics of bio-signals such as the electrocardiogram (ECG) and electroencephalogram (EEG) are analyzed using the tools of nonlinear systems theory. In modern industrialized countries, several hundred thousand people die every year from sudden cardiac death. The electrocardiogram (ECG) is an important bio-signal representing the sum total of millions of cardiac cell depolarization potentials. It contains important insight into the state of health and the nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool for observing the heart's ability to respond to the normal regulatory impulses that affect its rhythm. A computer-based intelligent system for the analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are non-linear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of non-linear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeat and four classes of arrhythmia. This thesis presents some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. Several features were extracted from the HOS and subjected to an Analysis of Variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding a p-value < 0.02 in the ANOVA test.
An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, seven features were extracted from the heart rate signals using HOS and fed to a support vector machine (SVM) for classification. The performance evaluation protocol in this thesis uses 330 subjects covering five different cardiac disease conditions. The classifier achieved a sensitivity of 90% and a specificity of 89%. This system is ready to run on larger data sets. In EEG analysis, the search for hidden information for the identification of seizures has a long history. Epilepsy is a pathological condition characterized by the spontaneous and unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. Automatic early detection of seizure onset would help patients and observers take appropriate precautions. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by the higher order spectra (HOS) has been reported to be a promising approach to differentiate between normal, background (pre-ictal) and epileptic EEG signals. In this work, these features are used to train both a Gaussian mixture model (GMM) classifier and a support vector machine (SVM) classifier. Results show that the classifiers were able to achieve 93.11% and 92.67% classification accuracy, respectively, with selected HOS-based features. About 2 hours of EEG recordings from 10 patients were used in this study. This thesis introduces unique bispectrum and bicoherence plots for various cardiac conditions and for normal, background and epileptic EEG signals. These plots reveal distinct patterns. The patterns are useful for visual interpretation by those without a deep understanding of spectral analysis, such as medical practitioners.
It includes original contributions in extracting features from HRV and EEG signals using HOS and entropy, in analyzing the statistical properties of such features on real data and in automated classification using these features with GMM and SVM classifiers.
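The bispectrum underlying the HOS features described above can be estimated directly by averaging triple products of FFT coefficients over signal segments. The following NumPy sketch shows one such estimate and a few magnitude features; the feature definitions are illustrative assumptions, and the thesis's actual feature set and SVM/GMM stages are not reproduced here.

```python
import numpy as np

def bispectrum(x, nfft=64):
    """Direct (segment-averaged) bispectrum estimate B(f1, f2) of a 1-D signal.

    B(f1, f2) = mean over segments of X(f1) * X(f2) * conj(X(f1 + f2)),
    where X is the FFT of each zero-mean segment.
    """
    segs = [x[i:i + nfft] for i in range(0, len(x) - nfft + 1, nfft)]
    B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
    for s in segs:
        X = np.fft.fft(s - np.mean(s), nfft)
        for f1 in range(nfft // 2):
            for f2 in range(nfft // 2):
                B[f1, f2] += X[f1] * X[f2] * np.conj(X[(f1 + f2) % nfft])
    return B / len(segs)

def hos_features(x):
    """A few scalar summaries of the bispectral magnitude (illustrative only)."""
    mag = np.abs(bispectrum(x))
    p = mag / mag.sum() + 1e-12  # normalised magnitude distribution (eps avoids log(0))
    return {"mean_mag": float(mag.mean()),
            "max_mag": float(mag.max()),
            "entropy": float(-np.sum(p * np.log(p)))}
```

In practice each feature vector (one per HRV record) would be passed to the ANOVA stage and then to a classifier; only the spectral estimation step is sketched here.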
Abstract:
The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data being generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices store data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify them slightly for higher density storage. Alternatively, three-dimensional optical data storage is a promising field for the information storage needs of the future, offering very high density, high speed memory. There are two ways in which data may be recorded in a three-dimensional optical medium: either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three-dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but few commercial products are presently available.
Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two-dimensional pages of data are recorded into a photorefractive crystal as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal lithium niobate (LiNbO3). Firstly, the experimental methods for storing two-dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask containing a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low-intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing. Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. A method to avoid this is thermal fixing, in which the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium.
This ionic grating is insensitive to the readout beam, and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to a situation where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three-dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower-magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered.
The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the times at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could be applied to image scrambling or cryptography for optical information storage. A two-dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest quality pattern storage would be achieved with a thin 0.5 mm medium; however, this type of medium would also remove the degradation property of the patterns and the subsequent recovery process. To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three-dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly for the degradation and recovery process, and confirms the theory that recovery of the degraded patterns is possible because the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes, compared with stripes of smaller widths.
As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm for accurate and reliable pattern storage.
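The oscillation-counting idea for temperature measurement can be sketched simply: count complete intensity oscillations in the recorded trace and multiply by the temperature change per oscillation. In the sketch below, `dT_per_cycle` stands in for that setup-dependent calibration constant (fixed by the probe wavelength, crystal thickness and the temperature derivative of the birefringence); the function names are illustrative, not the thesis's own code.

```python
import numpy as np

def count_oscillations(intensity):
    """Count full intensity oscillations as rising zero-crossings of the
    mean-removed trace (one rising crossing per complete cycle)."""
    x = np.asarray(intensity, dtype=float)
    x = x - x.mean()
    return int(np.sum((x[:-1] < 0) & (x[1:] >= 0)))

def temperature_change(intensity, dT_per_cycle):
    """Deduced temperature change = oscillation count * calibration constant."""
    return count_oscillations(intensity) * dT_per_cycle
```

With an expanded beam, the same counting could be run per pixel region of the CCD frames to map temperature change across the crystal, as the thesis describes.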
Abstract:
Acoustic emission (AE) is the phenomenon whereby high-frequency stress waves are generated by the rapid release of energy within a material from sources such as crack initiation or growth. The AE technique involves recording these stress waves by means of sensors placed on the surface and subsequently analysing the recorded signals to gather information such as the nature and location of the source. It is one of several diagnostic techniques currently used for structural health monitoring (SHM) of civil infrastructure such as bridges. Its advantages include the ability to provide continuous in-situ monitoring and high sensitivity to crack activity, but several challenges still exist. Due to the high sampling rate required for data capture, a large amount of data is generated during AE testing. This is further complicated by the presence of a number of spurious sources that can produce AE signals which can mask the desired signals. Hence, an effective data analysis strategy is needed to achieve source discrimination. This also becomes important for long-term monitoring applications in order to avoid massive data overload. Analysis of the frequency content of recorded AE signals, together with the use of pattern recognition algorithms, is among the advanced and promising data analysis approaches for source discrimination. This paper explores the use of various signal processing tools for the analysis of experimental data, with the overall aim of finding an improved method for source identification and discrimination, with particular focus on the monitoring of steel bridges.
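The frequency-content analysis mentioned above can be sketched in a few lines: compute the spectrum of a recorded burst and summarise it with scalar features suitable for a pattern recognition stage. The specific features below (peak frequency and spectral centroid) are common illustrative choices, not the paper's actual method.

```python
import numpy as np

def spectral_features(burst, fs):
    """Peak frequency (Hz) and spectral centroid (Hz) of one recorded AE burst."""
    x = np.asarray(burst, dtype=float)
    spec = np.abs(np.fft.rfft(x - x.mean()))       # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)    # bin frequencies in Hz
    peak = float(freqs[np.argmax(spec)])
    centroid = float(np.sum(freqs * spec) / np.sum(spec))
    return peak, centroid
```

Feature vectors of this kind, one per detected burst, are what a clustering or classification algorithm would then use to separate crack-related sources from spurious ones.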
Abstract:
Background: Efforts to prevent the development of overweight and obesity have increasingly focused on the early life course, as we recognise that both metabolic and behavioural patterns are often established within the first few years of life. Randomised controlled trials (RCTs) of interventions are even more powerful when, with forethought, they are synthesised into an individual patient data (IPD) prospective meta-analysis (PMA). An IPD PMA is a unique research design where several trials are identified for inclusion in an analysis before any of the individual trial results become known and the data are provided for each randomised patient. This methodology minimises the publication and selection bias often associated with a retrospective meta-analysis by allowing hypotheses, analysis methods and selection criteria to be specified a priori. Methods/Design: The Early Prevention of Obesity in CHildren (EPOCH) Collaboration was formed in 2009. The main objective of the EPOCH Collaboration is to determine if early intervention for childhood obesity impacts on body mass index (BMI) z-scores at age 18-24 months. Additional research questions will focus on whether early intervention has an impact on children's dietary quality, TV viewing time, duration of breastfeeding and parenting styles. This protocol includes the hypotheses, inclusion criteria and outcome measures to be used in the IPD PMA. The sample size of the combined dataset at final outcome assessment (approximately 1800 infants) will allow greater precision when exploring differences in the effect of early intervention with respect to pre-specified participant- and intervention-level characteristics. Discussion: Finalisation of the data collection procedures and analysis plans will be complete by the end of 2010. Data collection and analysis will occur during 2011-2012 and results should be available by 2013. Trial registration number: ACTRN12610000789066
Abstract:
This analysis of housing experiences and aspirations in three remote Indigenous settlements in Australia (Mimili, Maningrida and Palm Island) reveals extreme liveability problems directly related to the scale and form of housing provision. Based upon field visits to each of the settlements and extensive interviews with residents and local housing and community officers, the paper analyses two aspects of living in such housing conditions, at two spatial scales: the layout of the settlement and the design of individual houses. The failings at both scales are shown to be the fault of a dysfunctional housing system that has only recently been addressed.
Abstract:
Aims: Telemonitoring (TM) and structured telephone support (STS) have the potential to deliver specialised management to more patients with chronic heart failure (CHF), but their efficacy is still to be proven. Objectives: To review randomised controlled trials (RCTs) of TM or STS on all-cause mortality and all-cause and CHF-related hospitalisations in patients with CHF, as a non-invasive remote model of specialised disease-management intervention. Methods and Results: Data sources: We searched 15 electronic databases and hand-searched bibliographies of relevant studies, systematic reviews, and meeting abstracts. Two reviewers independently extracted all data. Study eligibility and participants: We included any randomised controlled trial (RCT) comparing TM or STS to usual care of patients with CHF. Studies that included intensified management with additional home or clinic visits were excluded. Synthesis: Primary outcomes (mortality and hospitalisations) were analysed; secondary outcomes (cost, length of stay, quality of life) were tabulated. Results: Thirty RCTs of STS and TM were identified (25 peer-reviewed publications (n=8,323) and five abstracts (n=1,482)). Of the 25 peer-reviewed studies, 11 evaluated TM (2,710 participants), 16 evaluated STS (5,613 participants) and two tested both interventions. TM reduced all-cause mortality (risk ratio (RR) 0.66 [95% CI 0.54-0.81], p<0.0001) and STS showed similar trends (RR 0.88 [95% CI 0.76-1.01], p=0.08). Both TM (RR 0.79 [95% CI 0.67-0.94], p=0.008) and STS (RR 0.77 [95% CI 0.68-0.87], p<0.0001) reduced CHF-related hospitalisations. Both interventions improved quality of life, reduced costs, and were acceptable to patients. Improvements in prescribing, patient knowledge and self-care, and functional class were observed. Conclusion: TM and STS both appear to be effective interventions for improving outcomes in patients with CHF.
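The risk ratios quoted above come from meta-analytic pooling across trials; as a reminder of how a single risk ratio and its confidence interval are formed from event counts, here is a small sketch using the standard log-scale Wald interval (this is not the review's pooling method, and any counts used with it are invented):

```python
import math

def risk_ratio(events_tx, n_tx, events_ctl, n_ctl, z=1.96):
    """Risk ratio with a Wald 95% CI computed on the log scale."""
    rr = (events_tx / n_tx) / (events_ctl / n_ctl)
    se_log = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctl - 1 / n_ctl)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi
```

An RR below 1 with a CI excluding 1 (as for TM's 0.66 [0.54-0.81]) indicates a statistically significant reduction in risk.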
Abstract:
The period from 2007 to 2009 covered both the residential property boom that had run from early 2000 and the property recession following the Global Financial Crisis (GFC). Since late 2008, a number of residential property markets have suffered significant falls in house prices, but this has not been consistent across all market sectors. This paper analyses the housing market in Brisbane, Australia to determine the impact, similarities and differences that the GFC had on a range of residential sectors across a diversified property market. Data analysis will provide an overview of residential property prices, sales and listing volumes over the study period and a comparison of median house price performance across the geographic and socio-economic areas of Brisbane.
Abstract:
The Australian tourism tertiary education sector operates in a competitive and dynamic environment, which necessitates a market orientation to be successful. Academic staff and management in the sector must regularly assess the perceptions of prospective and current students and monitor the satisfaction levels of current students. This study is concerned with setting and monitoring the satisfaction levels of current students, reporting the results of three longitudinal investigations of student satisfaction in a postgraduate unit. The study also addresses a limitation of a university's generic teaching evaluation instrument. Importance-Performance Analysis (IPA) has been recommended as a simple but effective tool for overcoming the deficiencies of many student evaluation studies, which have generally measured only attribute performance at the end of a semester. IPA was used to compare student expectations of the unit at the beginning of a semester with their perceptions of performance 10 weeks later. The first stage documented key benchmarks against which amendments to the unit based on student feedback could be evaluated during subsequent teaching periods.
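Importance-Performance Analysis places each attribute in one of four action quadrants by comparing its importance and performance scores against grand-mean cut-offs. A minimal sketch follows; the quadrant labels are the conventional IPA grid, while attribute names and scores are invented, not this study's instrument.

```python
def ipa_quadrant(importance, performance, i_cut, p_cut):
    """Assign one attribute to a quadrant of the classic IPA grid."""
    if importance >= i_cut and performance < p_cut:
        return "Concentrate here"
    if importance >= i_cut:
        return "Keep up the good work"
    if performance < p_cut:
        return "Low priority"
    return "Possible overkill"

def ipa(scores):
    """scores: {attribute: (importance, performance)} on a common rating scale.
    Cut-offs are the grand means of importance and performance."""
    i_cut = sum(i for i, _ in scores.values()) / len(scores)
    p_cut = sum(p for _, p in scores.values()) / len(scores)
    return {a: ipa_quadrant(i, p, i_cut, p_cut) for a, (i, p) in scores.items()}
```

Running this on beginning-of-semester importance ratings paired with week-10 performance ratings reproduces the kind of expectation-versus-performance comparison the study describes.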
Abstract:
A concise introduction to the key ideas and issues in the study of media economics, drawing on a broad range of case studies - from Amazon and Twitter, to Apple and Netflix - to illustrate how economic paradigms are not just theories, but provide important practical insights into how the media operates today. Understanding the economic paradigms at work in media industries and markets is vitally important for the analysis of the media system as a whole. The changing dynamics of media production, distribution and consumption are stretching the capacity of established economic paradigms. In addition to succinct accounts of neo-classical and critical political economics, the text offers fresh perspectives for understanding media drawn from two 'heterodox' approaches: institutional economics and evolutionary economics. Applying these paradigms to vital topics and case studies, Media Economics stresses the value – and limits – of contending economic approaches in understanding how the media operates today. It is essential reading for all students of Media and Communication Studies, and also those from Economics, Policy Studies, Business Studies and Marketing backgrounds who are studying the media. Table of Contents: 1. Media Economics: The Mainstream Approach 2. Critical Political Economy of the Media 3. Institutional Economics 4. Evolutionary Economics 5. Case Studies and Conclusions
Abstract:
This study aims to examine the impact of socio-ecologic factors on the transmission of Ross River virus (RRV) infection and to identify areas prone to social and ecologic-driven epidemics in Queensland, Australia. We used a Bayesian spatiotemporal conditional autoregressive model to quantify the relationship between monthly variation of RRV incidence and socio-ecologic factors and to determine spatiotemporal patterns. Our results show that the average increase in monthly RRV incidence was 2.4% (95% credible interval (CrI): 0.1–4.5%) and 2.0% (95% CrI: 1.6–2.3%) for a 1°C increase in monthly average maximum temperature and a 10 mm increase in monthly average rainfall, respectively. A significant spatiotemporal variation and interactive effect between temperature and rainfall on RRV incidence were found. No association between Socio-economic Index for Areas (SEIFA) and RRV was observed. The transmission of RRV in Queensland, Australia appeared to be primarily driven by ecologic variables rather than social factors.
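The percentage increases reported above are the usual transformation of log-link regression coefficients, 100 * (exp(beta) - 1). A quick sketch of that interpretation step (the coefficient below is back-calculated from the abstract's 2.4% figure, purely for illustration):

```python
import math

def pct_change_per_unit(beta):
    """Percent change in expected incidence for a one-unit covariate increase
    under a log-link regression (as in Poisson or CAR count models)."""
    return 100.0 * (math.exp(beta) - 1.0)
```

So a temperature coefficient of ln(1.024) per °C corresponds to the reported 2.4% rise in monthly RRV incidence per 1°C increase in average maximum temperature.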