Abstract:
Nontuberculous mycobacteria (NTM) are ubiquitous environmental organisms that have been recognised as a cause of pulmonary infection for over 50 years. Traditionally, patients have had underlying risk factors for the development of disease; however, the proportion of apparently immunocompetent patients involved appears to be rising. Not all patients culture-positive for mycobacteria will have progressive disease, making the diagnosis difficult, though criteria to aid in this process are available. The two main forms of disease are cavitary disease (usually involving the upper lobes) and fibronodular bronchiectasis (predominantly the middle and lingular lobes). For patients with disease, combination antibiotic therapy for 12-24 months is generally required for successful treatment, and this may be accompanied by drug intolerances and side effects. Published success rates range from 30% to 82%. As the progression of disease is variable, for some patients attention to pulmonary hygiene and underlying diseases, without immediate antimycobacterial therapy, may be more appropriate. Surgery can be a useful adjunct, though it is associated with risks. Randomised controlled trials in well-described patients would provide stronger evidence-based data to guide therapy of NTM lung diseases, and thus are much needed.
Abstract:
The move to a market model of schooling has seen a radical restructuring of the ways schooling is “done” in recent times in Western countries. Although there has been a great deal of work examining the effects of a market model on local school management (LSM), teachers’ work and university systems, relatively little has been done to examine its effect on parents’ choice of school in the non-government sector in Australia. This study examines the reasons parents give for choosing a non-government school in the outer suburbs of one large city in Australia. Drawing on the work of Bourdieu, specifically his ideas on “cultural capital” (1977), this study revealed that parents were choosing the non-government school over the government school to ensure that their children would be provided, through the school’s emphasis on cultural capital, with access to a perceived “better life”, thus enhancing the potential to facilitate “extraordinary children”, one of the school’s marketing claims.
Abstract:
While it is uncontested that the medical profession makes a valuable contribution to society, doctors should not always be beyond the reach of the criminal law, and they should not automatically be treated as God. Doctors should act reasonably and be conscious of their position of trust. In this sense, the notion of “doctors” is construed broadly to include a range of health care professionals, such as podiatrists, radiographers, surgeons and general practitioners. This paper explores contemporary Australian examples where doctors have acted inappropriately and been convicted of non-fatal offences against the person. The physical invasiveness involved in these scenarios varies significantly: in one example, a doctor penetrates a patient’s private body part with a probe for their own sexual gratification; in another, a doctor covertly visually records a naked patient. The examples are connected to the theories underpinning criminalisation, particularly social welfare and individual autonomy, with a view to framing guidelines on when doctors should not be immune from non-fatal offences against the person, and thus when the criminal law should respond.
Abstract:
This paper proposes a method of enhancing system stability with a distribution static compensator (DSTATCOM) in an autonomous microgrid with multiple distributed generators (DGs). It is assumed that both inertial and non-inertial DGs are connected to the microgrid. The inertial DG can be a synchronous machine of smaller rating, while inertialess DGs (e.g., solar) are modelled as DC sources. The inertialess DGs are connected to the microgrid through voltage source converters (VSCs). The VSCs are controlled in either state feedback or current feedback mode to achieve the desired voltage-current or power outputs, respectively. Power sharing among the DGs is achieved by drooping the voltage angle. Once the reference for the output voltage magnitude and angle is calculated from the droop, state feedback controllers are used to track the reference. The angle reference for the synchronous machine is compared with the output voltage angle of the machine, and the error is fed to a PI controller. The controller output is used to set the power reference of the synchronous machine. The rate of change of angle in a synchronous machine is restricted by the machine inertia; to mimic this behaviour, the rate of change of the VSC angles is restricted by a derivative feedback term in the droop control. The connected DSTATCOM provides ride-through capability during power imbalance in the microgrid, especially when the stored energy of the inertial DG is not sufficient to maintain stability. The inclusion of the DSTATCOM in such cases ensures system stability. The efficacies of the controllers are established through extensive simulation studies using PSCAD.
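The droop-plus-derivative-feedback structure described in this abstract can be illustrated compactly. The Python sketch below is for exposition only: the gains, the time step, and the names vsc_angle_ref and MachinePowerPI are illustrative assumptions, not values or code from the paper.

```python
dt = 1e-3              # control time step [s] (illustrative)
m_droop = 0.01         # angle-droop gain [rad/W] (illustrative)
k_d = 0.05             # derivative-feedback gain; slows angle changes to mimic inertia
kp, ki = 2.0, 20.0     # PI gains for the machine power loop (illustrative)

def vsc_angle_ref(P_meas, P_rated, d_delta_dt, delta_nom=0.0):
    """Angle droop with derivative feedback: the droop term shares power
    among the DGs, while the derivative term restricts the rate of change
    of the VSC angle, mimicking synchronous machine inertia."""
    return delta_nom - m_droop * (P_meas - P_rated) - k_d * d_delta_dt

class MachinePowerPI:
    """PI loop: the error between the droop angle reference and the measured
    machine angle sets the power reference of the synchronous machine."""
    def __init__(self):
        self.integral = 0.0

    def step(self, delta_ref, delta_meas):
        err = delta_ref - delta_meas
        self.integral += err * dt
        return kp * err + ki * self.integral
```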
Abstract:
Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges.

The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory, and heavy tails are pronounced in their probability densities.

The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of the Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA.

The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
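MF-DFA, the memory-detection tool used throughout this thesis, follows a standard recipe: integrate the series into a profile, detrend it locally in windows of varying size, and read the generalised Hurst exponent h(q) off the log-log scaling of the q-th-order fluctuation function. A minimal Python sketch follows; the window sizes, the detrending order and the placeholder input series are illustrative assumptions, not the settings used in the thesis.

```python
import numpy as np

def mfdfa(x, scales, q=2.0, order=1):
    """q-th-order fluctuation function F_q(s) of series x at each scale s.
    The generalised Hurst exponent h(q) is the slope of log F_q(s) vs log s;
    h(2) > 0.5 suggests long memory, h(2) < 0.5 anti-persistence."""
    y = np.cumsum(x - np.mean(x))                 # profile (integrated series)
    F_q = []
    for s in scales:
        n_seg = len(y) // s
        variances = []
        for v in range(n_seg):
            seg = y[v * s:(v + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)  # local detrending
            variances.append(np.mean((seg - trend) ** 2))
        variances = np.array(variances)
        if q == 0:                                 # q = 0 requires a log average
            F_q.append(np.exp(0.5 * np.mean(np.log(variances))))
        else:
            F_q.append(np.mean(variances ** (q / 2.0)) ** (1.0 / q))
    return np.array(F_q)

# Slope of the log-log scaling plot gives h(q); q = 2 recovers standard DFA.
scales = np.array([16, 32, 64, 128, 256])
x = np.random.randn(4096)                          # placeholder series
h2 = np.polyfit(np.log(scales), np.log(mfdfa(x, scales)), 1)[0]
```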
Abstract:
Corneal topography estimation based on the Placido disk principle relies on a good-quality precorneal tear film and a sufficiently wide eyelid (palpebral) aperture to avoid reflections from eyelashes. In practice, however, these conditions are not always fulfilled, resulting in missing regions, smaller corneal coverage, and subsequently poorer estimates of corneal topography. Our aim was to enhance the standard operating range of a Placido disk videokeratoscope to obtain reliable corneal topography estimates in patients with poor tear film quality, such as encountered in those diagnosed with dry eye, and with narrower palpebral apertures, as in the case of Asian subjects. This was achieved by incorporating into the instrument’s own topography estimation algorithm an image processing technique that comprises a polar-domain adaptive filter and a morphological closing operator. The experimental results from measurements of test surfaces and real corneas showed that the incorporation of the proposed technique leads to better estimates of corneal topography and, in many cases, to a significant increase in the estimated coverage area, making such an enhanced videokeratoscope a better tool for clinicians.
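The two-step enhancement named in this abstract (a polar-domain filter followed by a morphological closing) can be sketched as below. This is an illustration under stated assumptions only: a median filter stands in for the instrument's adaptive filter, the helper names to_polar and enhance_rings are invented here, and the grid and structuring-element sizes are arbitrary.

```python
import numpy as np
from scipy import ndimage

def to_polar(img, center, n_r=200, n_theta=360):
    """Resample a Cartesian ring image onto an (r, theta) grid so that
    subsequent filtering acts along the Placido rings."""
    r = np.linspace(0, min(center), n_r)           # crude maximum radius
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    rows = center[0] + rr * np.sin(tt)
    cols = center[1] + rr * np.cos(tt)
    return ndimage.map_coordinates(img, [rows, cols], order=1)

def enhance_rings(polar_img):
    """Polar-domain smoothing followed by a greyscale morphological closing
    that bridges gaps left by eyelashes or tear film break-up."""
    smoothed = ndimage.median_filter(polar_img, size=(1, 5))  # along each ring
    return ndimage.grey_closing(smoothed, size=(3, 7))        # fill dark gaps
```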
Abstract:
The scalable video coding (SVC) extension of the H.264/AVC standard enables adaptive and flexible delivery to multiple devices under various network conditions. Only a few works have addressed the influence of the different scalability parameters (frame rate, spatial resolution, and SNR) on user-perceived quality (UPQ), and only within a limited scope. In this paper, we conduct a subjective quality assessment experiment for video sequences encoded with H.264/SVC to gain a better understanding of the correlation between video content and UPQ at all scalable layers, and of the impact of the rate-distortion method and the different scalabilities on bitrate and UPQ. Findings from this experiment will contribute to a user-centered design for the adaptive delivery of scalable video streams.
Abstract:
The increasing diversity of the Internet has created a vast number of multilingual resources on the Web. A huge number of these documents are written in languages other than English. Consequently, the demand for searching in non-English languages is growing rapidly, and it is desirable that a search engine can search for information over collections of documents in other languages. This research investigates techniques for developing high-quality Chinese information retrieval systems. A distinctive feature of Chinese text is that a Chinese document is a sequence of Chinese characters with no space or boundary between Chinese words. This feature makes Chinese information retrieval more difficult, since a retrieved document containing the query term as a sequence of Chinese characters may not actually be relevant to the query: the query term (as a sequence of Chinese characters) may not be a valid Chinese word in that document. On the other hand, a document that is actually relevant may not be retrieved because it does not contain the query sequence but contains other relevant words. In this research, we propose two approaches to deal with these problems. In the first approach, we propose a hybrid Chinese information retrieval model that incorporates word-based techniques into the traditional character-based techniques. The aim of this approach is to investigate the influence of Chinese segmentation on the performance of Chinese information retrieval. Two ranking methods are proposed to rank retrieved documents based on their relevancy to the query, calculated by combining character-based ranking and word-based ranking. Our experimental results show that Chinese segmentation can improve the performance of Chinese information retrieval, but the improvement is not significant if only Chinese segmentation is incorporated into the traditional character-based approach. In the second approach, we propose a novel query expansion method which applies text mining techniques to find the words most relevant for extending the query. Unlike most existing query expansion methods, which generally select highly frequent indexing terms from the retrieved documents to expand the query, our approach utilizes text mining techniques to find patterns in the retrieved documents that highly correlate with the query term, and then uses the relevant words in those patterns to expand the original query. This research project develops and implements a Chinese information retrieval system for evaluating the proposed approaches. There are two stages in the experiments. The first stage investigates whether high-accuracy segmentation can improve Chinese information retrieval. In the second stage, the text-mining-based query expansion approach is implemented, and a further experiment compares its performance with that of the standard Rocchio approach. The NTCIR5 Chinese collections are used in the experiments. The experimental results show that by incorporating the text-mining-based query expansion into the hybrid model, significant improvement is achieved in both precision and recall.
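The hybrid character/word ranking idea can be made concrete with a small sketch. Everything below is an assumption for illustration: a toy TF-IDF scorer, a mixing weight lam, and a caller-supplied segment function standing in for a Chinese word segmenter; the thesis's two actual ranking methods are not reproduced here.

```python
import math
from collections import Counter

def tf_idf_score(query_units, doc_units, df, n_docs):
    """Toy TF-IDF relevance of a document to a query, where 'units' are
    either single Chinese characters or segmented words."""
    tf = Counter(doc_units)
    score = 0.0
    for u in set(query_units):
        if tf[u] and df.get(u):                    # unit seen in doc and corpus
            score += (1 + math.log(tf[u])) * math.log(n_docs / df[u])
    return score

def hybrid_score(query, doc, df_char, df_word, n_docs, segment, lam=0.5):
    """Blend character-level and word-level evidence with weight lam;
    'segment' is any word segmenter mapping a string to a list of words."""
    s_char = tf_idf_score(list(query), list(doc), df_char, n_docs)
    s_word = tf_idf_score(segment(query), segment(doc), df_word, n_docs)
    return lam * s_char + (1 - lam) * s_word
```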
Abstract:
We aimed to investigate the naturally occurring horizontal-plane movements of a head stabilized in a standard ophthalmic headrest and to analyze their magnitude, velocity, spectral characteristics, and correlation with the cardiopulmonary system. Two custom-made, air-coupled, highly accurate (±2 μm) ultrasound transducers were used to measure the displacements of the head in different horizontal directions at a sampling frequency of 100 Hz. Synchronously with the head movements, an electrocardiogram (ECG) signal was recorded. Three healthy subjects participated in the study. Frequency analysis of the recorded head movements and their velocities was carried out, and functions of coherence between the two displacements and the ECG signal were calculated. The frequencies of respiration and the heartbeat were clearly visible in all recorded head movements. The amplitude of head displacements was typically in the range of ±100 μm. The first harmonic of the heartbeat (in the range of 2–3 Hz), rather than its principal frequency, was found to be the dominant frequency of both the head movements and their velocities. Coherence analysis showed high interdependence between the considered signals for frequencies of up to 20 Hz. These findings may contribute to the design of better ophthalmic headrests and should help other studies in deciding whether to use a heavy headrest or a bite bar.
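The spectral and coherence analysis reported in this abstract follows a standard signal-processing pipeline, sketched below with synthetic stand-in signals. The 100 Hz sampling rate matches the study; the recording length and Welch segment length are illustrative choices.

```python
import numpy as np
from scipy.signal import coherence, welch

fs = 100.0                              # sampling frequency [Hz], as in the study
t = np.arange(0, 60, 1 / fs)            # placeholder 60 s recording
head = np.random.randn(t.size)          # stand-in for head displacement [um]
ecg = np.random.randn(t.size)           # stand-in for the ECG channel

f_psd, Pxx = welch(head, fs=fs, nperseg=1024)          # spectrum of head movement
f_coh, Cxy = coherence(head, ecg, fs=fs, nperseg=1024) # head-ECG coherence

# Dominant movement frequency (the study finds the first harmonic of the
# heartbeat, around 2-3 Hz, rather than its principal frequency):
dominant = f_psd[np.argmax(Pxx[1:]) + 1]               # skip the DC bin
```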