222 results for uncertain polynomials


Relevance: 10.00%

Publisher:

Abstract:

Purpose: All currently considered parametric models used for decomposing videokeratoscopy height data are viewer-centered and hence describe what the operator sees rather than what the surface is. The purpose of this study was to ascertain the applicability of an object-centered representation to modeling of corneal surfaces. Methods: A three-dimensional surface decomposition into a series of spherical harmonics is considered and compared with the traditional Zernike polynomial expansion for a range of videokeratoscopic height data. Results: Spherical harmonic decomposition led to significantly better fits to corneal surfaces (in terms of the root mean square error values) than the corresponding Zernike polynomial expansions with the same number of coefficients, for all considered corneal surfaces, corneal diameters, and model orders. Conclusions: Spherical harmonic decomposition is a viable alternative to Zernike polynomial decomposition. It achieves better fits to videokeratoscopic height data and has the advantage of an object-centered representation that could be particularly suited to the analysis of multiple corneal measurements.

Abstract:

A common optometric problem is to specify the eye’s ocular aberrations in terms of Zernike coefficients and to reduce that specification to a prescription for the optimum sphero-cylindrical correcting lens. The typical approach is first to reconstruct wavefront phase errors from measurements of wavefront slopes obtained by a wavefront aberrometer. This paper applies a new method to this clinical problem that does not require wavefront reconstruction. Instead, we base our analysis on axial wavefront vergence as inferred directly from wavefront slopes. The result is a wavefront vergence map that is similar to the axial power maps in corneal topography and hence has the potential to be favoured by clinicians. We use our new set of orthogonal Zernike slope polynomials to systematically analyse details of the vergence map analogous to Zernike analysis of wavefront maps. The result is a vector of slope coefficients that describe fundamental aberration components. Three different methods for reducing slope coefficients to a sphero-cylindrical prescription in power vector form are compared and contrasted. When the original wavefront contains only second order aberrations, the vergence map is a function of meridian only and the power vectors from all three methods are identical. The differences in the methods begin to appear as we include higher order aberrations, in which case the wavefront vergence map is more complicated. Finally, we discuss the advantages and limitations of vergence map representation of ocular aberrations.
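The paper’s three slope-coefficient reduction methods are not reproduced here, but the power-vector form they all reduce to is standard. A minimal sketch, assuming only the conventional sphero-cylinder to power-vector conversion (M, J0, J45) of Thibos et al.; the sample prescription is illustrative, not from the paper:

```python
import math

def to_power_vector(sphere, cyl, axis_deg):
    """Convert a sphero-cylindrical prescription (S, C, axis) to the
    power-vector form (M, J0, J45) of Thibos et al."""
    theta = math.radians(axis_deg)
    M = sphere + cyl / 2.0                      # spherical equivalent
    J0 = -(cyl / 2.0) * math.cos(2.0 * theta)   # with/against-the-rule component
    J45 = -(cyl / 2.0) * math.sin(2.0 * theta)  # oblique component
    return M, J0, J45

# Hypothetical prescription: -2.00 DS / -1.00 DC x 180
M, J0, J45 = to_power_vector(-2.0, -1.0, 180.0)
# M = -2.5 D; at axis 180 the astigmatism falls entirely into J0
```

Because (M, J0, J45) are the coordinates of a vector, prescriptions from different reduction methods can be compared by simple Euclidean distance in this space.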

Abstract:

The book is a joint effort of eight academics and journalists, Europe specialists from six countries (Australia, Germany, Poland, Slovenia, the United Kingdom and the United States). They give sometimes divergent views on the future of the so-called “European Project”, for building a common European economy and society, but agree that cultural changes, especially changes experienced through mass media, are rapidly taking place. One of the central interests of the book is the operation of the large media centre located at the European Commission in Brussels – the world’s largest gallery of permanently accredited correspondents. Jacket notes: The Lisbon Treaty of December 2009 is the latest success of the European Union’s drive to restructure and expand; yet questions persist about how democratic this new Europe might be. Will Brussels’ promotion of the “European idea” produce a common European culture and society? The authors consider it might, as a culture of everyday shared experience, though old ways are cherished, citizens forever thinking twice about committing to an uncertain future. The book focuses on mass media, as a prime agent of change, sometimes used deliberately to promote a “European project”; sometimes acting more naturally as a medium for new agendas. It looks at proposed media models for Europe, ranging from not very successful pan-European television, to the potentials of media systems based on national markets, and new media based on digital formats. It also studies the Brussels media service, the centre operated by the European Commission, which is the world’s largest concentration of journalists; and ways that dominant national media may come to serve the interests of communities now extending across frontiers. Europe and the Media notes change especially as encountered by new EU member countries of central and eastern Europe.

Abstract:

Unmanned Aerial Vehicles (UAVs) are emerging as an ideal platform for a wide range of civil applications such as disaster monitoring, atmospheric observation and outback delivery. However, the operation of UAVs is currently restricted to specially segregated regions of airspace outside of the National Airspace System (NAS). Mission Flight Planning (MFP) is an integral part of UAV operation that addresses some of the requirements (such as safety and the rules of the air) of integrating UAVs in the NAS. Automated MFP is a key enabler for a number of UAV operating scenarios as it aids in increasing the level of onboard autonomy. For example, onboard MFP is required to ensure continued conformance with the NAS integration requirements when there is an outage in the communications link. MFP is a motion planning task concerned with finding a path between a designated start waypoint and goal waypoint. This path is described with a sequence of 4 Dimensional (4D) waypoints (three spatial and one time dimension) or equivalently with a sequence of trajectory segments (or tracks). It is necessary to consider the time dimension as the UAV operates in a dynamic environment. Existing methods for generic motion planning, UAV motion planning and general vehicle motion planning cannot adequately address the requirements of MFP. The flight plan needs to optimise for multiple decision objectives including mission safety objectives, the rules of the air and mission efficiency objectives. Online (in-flight) replanning capability is needed as the UAV operates in a large, dynamic and uncertain outdoor environment. This thesis derives a multi-objective 4D search algorithm entitled Multi-Step A* (MSA*) based on the seminal A* search algorithm. MSA* is proven to find the optimal (least cost) path given a variable successor operator (which enables arbitrary track angle and track velocity resolution).
Furthermore, it is shown to be of comparable complexity to multi-objective, vector neighbourhood based A* (Vector A*, an extension of A*). A variable successor operator enables the imposition of a multi-resolution lattice structure on the search space (which results in fewer search nodes). Unlike cell decomposition based methods, soundness is guaranteed with multi-resolution MSA*. MSA* is demonstrated through Monte Carlo simulations to be computationally efficient. It is shown that multi-resolution, lattice based MSA* finds paths of equivalent cost (less than 0.5% difference) to Vector A* (the benchmark) in a third of the computation time (on average). This is the first contribution of the research. The second contribution is the discovery of the additive consistency property for planning with multiple decision objectives. Additive consistency ensures that the planner is not biased (which results in a suboptimal path) by ensuring that the cost of traversing a track using one step equals that of traversing the same track using multiple steps. MSA* mitigates uncertainty through online replanning, Multi-Criteria Decision Making (MCDM) and tolerance. Each trajectory segment is modeled with a cell sequence that completely encloses the trajectory segment. The tolerance, measured as the minimum distance between the track and cell boundaries, is the third major contribution. Even though MSA* is demonstrated for UAV MFP, it is extensible to other 4D vehicle motion planning applications. Finally, the research proposes a self-scheduling replanning architecture for MFP. This architecture replicates the decision strategies of human experts to meet the time constraints of online replanning. Based on a feedback loop, the proposed architecture switches between fast, near-optimal planning and optimal planning to minimise the need for hold manoeuvres. 
The derived MFP framework is original and shown, through extensive verification and validation, to satisfy the requirements of UAV MFP. As MFP is an enabling factor for operation of UAVs in the NAS, the presented work is both original and significant.
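MSA* itself, with its variable successor operator and 4D waypoints, is beyond a short sketch, but the best-first search it derives from can be illustrated. A minimal A* over a 2D grid, with the multiple decision objectives collapsed into a single scalar edge cost; the grid, unit costs and Manhattan heuristic are illustrative assumptions, not the thesis’s model:

```python
import heapq

def a_star(start, goal, neighbors, edge_cost, heuristic):
    """Minimal A*: returns the least-cost path from start to goal, or None.

    Optimality requires an admissible heuristic (never overestimates)."""
    open_heap = [(heuristic(start, goal), 0.0, start, None)]
    came_from = {}             # node -> parent on the best path found
    best_g = {start: 0.0}      # cheapest known cost-to-come per node
    while open_heap:
        f, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:  # already expanded via a cheaper route
            continue
        came_from[node] = parent
        if node == goal:       # reconstruct the path by walking parents back
            path = [node]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        for nbr in neighbors(node):
            ng = g + edge_cost(node, nbr)
            if ng < best_g.get(nbr, float("inf")):
                best_g[nbr] = ng
                heapq.heappush(open_heap, (ng + heuristic(nbr, goal), ng, nbr, node))
    return None

# Illustrative 5x5 grid: 4-connected moves and unit edge cost (a weighted sum
# of safety and efficiency objectives would slot into edge_cost instead).
def neighbors(p):
    x, y = p
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < 5 and 0 <= y + dy < 5]

path = a_star((0, 0), (4, 4), neighbors,
              edge_cost=lambda a, b: 1.0,
              heuristic=lambda p, q: abs(p[0] - q[0]) + abs(p[1] - q[1]))
# 9 waypoints: the Manhattan-optimal path length on this grid
```

MSA* replaces the fixed neighbour set with a variable successor operator, which is what permits the multi-resolution lattice and arbitrary track angles described above.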

Abstract:

The word “queer” is a slippery one; its etymology is uncertain, and academic and popular usage attributes conflicting meanings to the word. By the mid-nineteenth century, “queer” was used as a pejorative term for a (male) homosexual. This negative connotation continues when it becomes a term for homophobic abuse. In recent years, “queer” has taken on additional uses: as an all-encompassing term for culturally marginalised sexualities – gay, lesbian, trans, bi, and intersex (“GLBTI”) – and as a theoretical strategy which deconstructs binary oppositions that govern identity formation. Tracing its history, the Oxford English Dictionary notes that the earliest references to “queer” may have appeared in the sixteenth century. These early examples of queer carried negative connotations such as “vulgar,” “bad,” “worthless,” “strange,” or “odd” and such associations continued until the mid-twentieth century. The early nineteenth century, and perhaps earlier, employed “queer” as a verb, meaning “to put out of order,” “to spoil,” “to interfere with.” The adjectival form also began to emerge during this time to refer to a person’s condition as being “not normal,” “out of sorts” or to cause a person “to feel queer” meaning “to disconcert, perturb, unsettle.” According to Eve Sedgwick (1993), “the word ‘queer’ itself means across – it comes from the Indo-European root – twerkw, which also yields the German quer (traverse), Latin torquere (to twist), English athwart . . . it is relational and strange.” Despite the gaps in the lineage and changes in usage, meaning and grammatical form, “queer” as a political and theoretical strategy has benefited from its diverse origins. It refuses to settle comfortably into a single classification, preferring instead to traverse several categories that would otherwise attempt to stabilise notions of chromosomal sex, gender and sexuality.

Abstract:

Purpose: To ascertain the effectiveness of object-centered three-dimensional representations for the modeling of corneal surfaces. Methods: Three-dimensional (3D) surface decomposition into a series of basis functions including: (i) spherical harmonics, (ii) hemispherical harmonics, and (iii) 3D Zernike polynomials were considered and compared to the traditional viewer-centered representation of two-dimensional (2D) Zernike polynomial expansion for a range of retrospective videokeratoscopic height data from three clinical groups. The data were collected using the Medmont E300 videokeratoscope. The groups included 10 normal corneas with corneal astigmatism less than −0.75 D, 10 astigmatic corneas with corneal astigmatism between −1.07 D and 3.34 D (Mean = −1.83 D, SD = ±0.75 D), and 10 keratoconic corneas. Only data from the right eyes of the subjects were considered. Results: All object-centered decompositions led to significantly better fits to corneal surfaces (in terms of the RMS error values) than the corresponding 2D Zernike polynomial expansions with the same number of coefficients, for all considered corneal surfaces, corneal diameters (2, 4, 6, and 8 mm), and model orders (4th to 10th radial orders). The best results (smallest RMS fit error) were obtained with spherical harmonics decomposition, which led to about 22% reduction in the RMS fit error, as compared to the traditional 2D Zernike polynomials. Hemispherical harmonics and the 3D Zernike polynomials reduced the RMS fit error by about 15% and 12%, respectively. Larger reductions in RMS fit error were achieved for smaller corneal diameters and lower order fits. Conclusions: Object-centered 3D decompositions provide viable alternatives to traditional viewer-centered 2D Zernike polynomial expansion of a corneal surface.
They achieve better fits to videokeratoscopic height data and could be particularly suited to the analysis of multiple corneal measurements, where there can be slight variations in the position of the cornea from one map acquisition to the next.
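The fit-and-compare step behind these RMS figures is ordinary linear least squares: project the measured heights onto a finite basis and report the RMS residual. A sketch of that step with a toy 1-D radial basis standing in for the real spherical-harmonic and Zernike bases; the basis, sample points and synthetic heights are assumptions for illustration only:

```python
import math

def lstsq_fit(basis, points, heights):
    """Fit heights[i] ~ sum_k coeffs[k] * basis[k](points[i]) by linear least
    squares via the normal equations; returns (coeffs, rms_error)."""
    m, n = len(points), len(basis)
    A = [[f(p) for f in basis] for p in points]
    # Normal equations: (A^T A) c = A^T h
    M = [[sum(A[i][r] * A[i][c] for i in range(m)) for c in range(n)]
         for r in range(n)]
    b = [sum(A[i][r] * heights[i] for i in range(m)) for r in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= factor * M[col][c]
            b[r] -= factor * b[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(M[r][c] * coeffs[c] for c in range(r + 1, n))
        coeffs[r] = s / M[r][r]
    residuals = [heights[i] - sum(coeffs[k] * A[i][k] for k in range(n))
                 for i in range(m)]
    rms = math.sqrt(sum(e * e for e in residuals) / m)
    return coeffs, rms

# Toy radial basis and synthetic 'height' data z(r) = 1 + 0.5 r^2
basis = [lambda r: 1.0, lambda r: r ** 2, lambda r: r ** 4]
points = [0.2, 0.4, 0.6, 0.8, 1.0]
heights = [1.0 + 0.5 * r ** 2 for r in points]
coeffs, rms = lstsq_fit(basis, points, heights)
# recovers coeffs close to [1.0, 0.5, 0.0] with rms near zero
```

Comparing two bases at equal coefficient count, as the study does, amounts to running this fit twice and comparing the returned RMS values.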

Abstract:

While my PhD is practice-led research, it is my contention that such an inquiry cannot develop as long as it tries to emulate other models of research. I assert that practice-led research needs to account for an epistemological unknown or uncertainty central to the practice of art. By focusing on what I call the artist's 'voice,' I will show how this 'voice' is comprised of a dual motivation—'articulate' representation and 'inarticulate' affect—which do not even necessarily derive from the artist. Through an analysis of art-historical precedents, critical literature (the work of Jean-François Lyotard and Andrew Benjamin, the critical methods of philosophy, phenomenology and psychoanalysis) as well as of my own painting and digital arts practice, I aim to demonstrate how this unknown or uncertain aspect of artistic inquiry can be mapped. It is my contention that practice-led research needs to address and account for this dualistic 'voice' in order to more comprehensively articulate its unique contribution to research culture.

Abstract:

Key points
• The clinical aims of MR spectroscopy (MRS) in seizure disorders are to help identify, localize and characterize epileptogenic foci.
• Lateralizing MRS abnormalities in temporal lobe epilepsy (TLE) may be used clinically in combination with structural and T2 MRI measurements, together with other techniques such as EEG, PET and SPECT.
• Characteristic metabolite abnormalities are decreased N-acetylaspartate (NAA) with increased choline (Cho) and myoinositol (mI) (short-echo time).
• Contralateral metabolite abnormalities are frequently seen in TLE, but are of uncertain significance.
• In extra-temporal epilepsy, metabolite abnormalities may be seen where MR imaging (MRI) is normal, but may not be sufficiently localized to be useful clinically.
• MRS may help to characterize epileptogenic lesions visible on MRI (aggressive vs. indolent neoplastic, dysplasia).
• Spectral editing techniques are required to evaluate specific epilepsy-relevant metabolites (e.g. γ-aminobutyric acid (GABA)), which may be useful in drug development and evaluation.
• MRS with phosphorus (31P) and other nuclei probes the metabolism of epilepsy, but is less useful clinically.
• There is potential for assessing drug mode of action and efficacy through 13C metabolite measurements, while changes in sodium homeostasis resulting from seizure activity may be detected with 23Na MRS.

Abstract:

CFD has been successfully used in the optimisation of aerodynamic surfaces using a given set of parameters such as Mach number and angle of attack. While carrying out a multidisciplinary design optimisation, one deals with situations where the parameters have some uncertainty attached. Any optimisation carried out for fixed values of input parameters gives a design which may be totally unacceptable under off-design conditions. The challenge is to develop a robust design procedure which takes into account the fluctuations in the input parameters. In this work, we attempt this using a modified Taguchi approach. This is incorporated into an evolutionary algorithm with many features developed in-house. The method is tested for a UCAV design which simultaneously handles aerodynamics, electromagnetics and maneuverability. Results demonstrate that the method has considerable potential.
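The modified Taguchi approach is not detailed in this abstract; one common way to fold parameter fluctuations into an evolutionary algorithm’s fitness is to evaluate each candidate under sampled perturbations and penalise both poor mean performance and high variance. A sketch under that assumption; the toy `drag` objective, the parameter names and the penalty weight `k` are all illustrative, not the paper’s model:

```python
import random
import statistics

def robust_fitness(objective, design, nominal, spreads, n_samples=50, k=1.0, seed=0):
    """Taguchi-flavoured robust fitness: evaluate the design under randomly
    perturbed input parameters and penalise both a poor mean and a high
    variance, so the optimiser favours designs insensitive to fluctuations."""
    rng = random.Random(seed)
    values = []
    for _ in range(n_samples):
        params = {p: rng.gauss(mu, spreads[p]) for p, mu in nominal.items()}
        values.append(objective(design, params))
    return statistics.mean(values) + k * statistics.pstdev(values)

# Toy quadratic 'drag' objective (hypothetical, not a CFD model): the design
# variable is best at 1.0, and fluctuation in Mach number adds a small penalty.
def drag(design, params):
    return (design - 1.0) ** 2 + 0.1 * (params["mach"] - 0.8) ** 2

f = robust_fitness(drag, design=1.2, nominal={"mach": 0.8}, spreads={"mach": 0.05})
# f is (1.2 - 1.0)**2 = 0.04 plus a small mean/variance penalty from the noise
```

Used as the fitness inside the evolutionary loop, this steers the search toward designs that stay acceptable under off-design conditions rather than only at the nominal point.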

Abstract:

The over-representation of novice drivers in crashes is alarming. Research indicates that one in five drivers crashes within their first year of driving. Driver training is one of the interventions aimed at decreasing the number of crashes that involve young drivers. Currently, there is a need to develop a comprehensive driver evaluation system that benefits from the advances in Driver Assistance Systems. Since driving is dependent on fuzzy inputs from the driver (i.e. approximate distance calculation from the other vehicles, approximate assumption of the other vehicle's speed), it is necessary that the evaluation system is based on criteria and rules that handle the uncertain and fuzzy characteristics of the drive. This paper presents a system that evaluates the data stream acquired from multiple in-vehicle sensors (acquired from the Driver Vehicle Environment, DVE) using fuzzy rules and classifies the driving manoeuvres (i.e. overtake, lane change and turn) as low risk or high risk. The fuzzy rules use parameters such as following distance, frequency of mirror checks, gaze depth and scan area, distance with respect to lanes and excessive acceleration or braking during the manoeuvre to assess risk. The fuzzy rules to estimate risk were designed after analysing the selected driving manoeuvres performed by driver trainers. This paper focuses mainly on the difference in gaze pattern between experienced and novice drivers during the selected manoeuvres. Using this system, trainers of novice drivers would be able to empirically evaluate and give feedback to novice drivers regarding their driving behaviour.
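A minimal sketch of how fuzzy rules of this kind might map sensor-derived parameters to a risk class. The membership functions, thresholds and the single aggregated rule ("close following distance OR rare mirror checks implies high risk") are illustrative assumptions, not the paper's calibrated rule base:

```python
def tri(x, a, b, c):
    """Triangular membership: 0 at a and c, peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def falling(x, b, c):
    """Falling ramp: full membership below b, fading to 0 at c."""
    if x <= b:
        return 1.0
    if x >= c:
        return 0.0
    return (c - x) / (c - b)

def manoeuvre_risk(following_distance_m, mirror_checks_per_min):
    """Two illustrative rules: 'close following distance' and 'rare mirror
    checks' each imply high risk; thresholds are made-up, not calibrated."""
    close = tri(following_distance_m, 0.0, 5.0, 25.0)
    rare = falling(mirror_checks_per_min, 2.0, 6.0)
    activation = max(close, rare)   # rule aggregation (fuzzy OR)
    label = "high risk" if activation > 0.5 else "low risk"
    return label, activation

# Tailgating with few mirror checks vs. relaxed following with frequent checks
print(manoeuvre_risk(10.0, 1.0))   # high risk
print(manoeuvre_risk(40.0, 8.0))   # low risk
```

A real system would add membership functions for gaze depth, lane position and acceleration, and many more rules, but the evaluate-aggregate-classify pattern is the same.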

Abstract:

Lawmakers are asking whether Australian researchers need an express 'experimental use' defense against patent infringement. The overriding policy for establishing a patent system is indisputably the promotion of innovation. According to traditional intellectual property pedagogy, the incentive to innovate flows from the reward afforded to the inventor. A balancing policy is that the patentee must fully disclose the invention, to help minimize the risks of duplication and to provide a basis for improvements by further research. Where there is uncertainty as to how these competing policy limbs are balanced, and whether a patentee can exclude others from experimenting on a patented invention, the uncertain legal environment disadvantages both the patentee and the researcher. Different jurisdictions have treated the experimental use question quite differently, with varied results for the researcher. The biotechnology industry is evolving at an unprecedented pace, and the law will, as is always the case, lag behind in its usual cautious fashion. The Australian law may finally catch up to researchers' concerns.

Abstract:

A decade ago, Queensland University of Technology (QUT) developed an innovative annual Courses Performance Report, but through incremental change, this report became quite labour-intensive. A new risk-based approach to course quality assurance, which consolidates voluminous data in a simple dashboard, responds to the changing context of the higher education sector. This paper will briefly describe QUT's context and outline the second phase of implementation of this new approach to course quality assurance. The main components are: Individual Course Reports (ICRs), the Consolidated Courses Performance Report (CCPR), the Underperforming Courses Status Update and the Strategic Faculty Courses Update (SFCU). These components together form a parsimonious and strategic annual cycle of reporting and place QUT in a positive position to respond to future sector change.

Abstract:

Aim. The paper is a report of a study to demonstrate how the use of schematics can provide procedural clarity and promote rigour in the conduct of case study research. Background. Case study research is a methodologically flexible approach to research design that focuses on a particular case – whether an individual, a collective or a phenomenon of interest. It is known as the 'study of the particular' for its thorough investigation of particular, real-life situations and is gaining increased attention in nursing and social research. However, the methodological flexibility it offers can leave the novice researcher uncertain of suitable procedural steps required to ensure methodological rigour. Method. This article provides a real example of a case study research design that utilizes schematic representation drawn from a doctoral study of the integration of health promotion principles and practices into a palliative care organization. Discussion. The issues discussed are: (1) the definition and application of case study research design; (2) the application of schematics in research; (3) the procedural steps and their contribution to the maintenance of rigour; and (4) the benefits and risks of schematics in case study research. Conclusion. The inclusion of visual representations of design with accompanying explanatory text is recommended in reporting case study research methods.

Abstract:

If one clear argument emerged from my doctoral thesis in political science, it is that there is no agreement as to what democracy is. There are over 40 different varieties of democracy, ranging from those in the mainstream with subtle or minute differences to those playing by themselves in the corner. And many of these various types of democracy are very well argued, empirically supported, and highly relevant to certain polities. The irony is that the thing all of these democratic varieties have in common, the ‘basic democracy’ from which all other forms of democracy stem, is elusive. There is no international agreement in the literature or in political practice as to what ‘basic democracy’ is, and that is problematic, as many of us use the word ‘democracy’ every day and it is a concept of tremendous importance internationally. I am still uncertain as to why this problem has not been resolved before by far greater minds than my own. It may have something to do with the recent growth in democratic theory this past decade and the innovative areas of thought my thesis required, but I think I’ve got the answer. By listing each type of democracy and filling the column next to this list with the literature associated with these various styles of democracy, I amassed a large and comprehensive body of textual data. My research intended to find out what these various styles of democracy had in common and to create a taxonomy (like the ‘tree of life’ in biology) of democracy, to attempt to show how various styles of democracy have ‘evolved’ over the past 5000 years. I then ran a word frequency analysis program, a piece of software that counts the 100 most commonly used words in the texts. This is where my logic came in, as I had to make sense of these words. How did they answer what the most fundamental commonalities are between 40 different styles of democracy?
I used a grounded theory analysis, which required that I argue my way through these words to form a ‘theory’, a plausible explanation as to why these particular words, and not others, are the important ones for answering the question. It came down to the argument that all 40 styles of democracy analysed have the following in common: 1) a concept of a citizenry; 2) a concept of sovereignty; 3) a concept of equality; 4) a concept of law; 5) a concept of communication; and 6) a concept of selecting officials. Thus, democracy is a defined citizenry with its own concept of sovereignty, which it exercises through the institutions which support the citizenry’s understandings of equality, law, communication, and the selection of officials. Once any of these six concepts is defined in a particular way, it creates a style of democracy. From this, we can also see that there can be more than one style of democracy active in a particular government, as a citizenry is composed of many different aggregates with their own understandings of the six concepts.
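The word-frequency step described above can be sketched in a few lines; the toy corpus, tokeniser and stopword set here are illustrative stand-ins for the thesis's actual texts and tooling:

```python
import re
from collections import Counter

def top_words(texts, n=100, stopwords=frozenset()):
    """Return the n most common words across a list of texts."""
    counts = Counter(
        w
        for text in texts
        for w in re.findall(r"[a-z']+", text.lower())
        if w not in stopwords
    )
    return counts.most_common(n)

# Tiny illustrative corpus (the real study used literature on 40 styles)
corpus = ["Direct democracy lets citizens vote on law.",
          "Representative democracy elects officials to make law."]
top = top_words(corpus, n=3, stopwords={"on", "to"})
# 'democracy' and 'law' each appear twice and top the list
```

The output list of (word, count) pairs is exactly the raw material the grounded-theory step then has to argue through.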

Abstract:

Recently it has been shown that the consumption of a diet high in saturated fat is associated with impaired insulin sensitivity and increased incidence of type 2 diabetes. In contrast, diets that are high in monounsaturated fatty acids (MUFAs) or polyunsaturated fatty acids (PUFAs), especially very long chain n-3 fatty acids (FAs), are protective against disease. However, the molecular mechanisms by which saturated FAs induce the insulin resistance and hyperglycaemia associated with metabolic syndrome and type 2 diabetes are not clearly defined. It is possible that saturated FAs may act through mechanisms different from those of MUFA and PUFA to regulate hepatic gene expression and metabolism. It is proposed that, like MUFA and PUFA, saturated FAs regulate the transcription of target genes. To test this hypothesis, hepatic gene expression analysis was undertaken in a human hepatoma cell line, Huh-7, after exposure to the saturated FA, palmitate. These experiments showed that palmitate is an effective regulator of gene expression for a wide variety of genes. A total of 162 genes were differentially expressed in response to palmitate. These changes not only affected the expression of genes related to nutrient transport and metabolism, they also extend to other cellular functions including cytoskeletal architecture, cell growth, protein synthesis and oxidative stress response. In addition, this thesis has shown that palmitate exposure altered the expression patterns of several genes that have previously been identified in the literature as markers of risk of disease development, including CVD, hypertension, obesity and type 2 diabetes. The altered gene expression patterns associated with an increased risk of disease include apolipoprotein-B100 (apo-B100), apo-CIII, plasminogen activator inhibitor 1, insulin-like growth factor-I and insulin-like growth factor binding protein 3.
This thesis reports the first observation that palmitate directly signals in cultured human hepatocytes to regulate expression of genes involved in energy metabolism as well as other important genes. Prolonged exposure to long-chain saturated FAs reduces glucose phosphorylation and glycogen synthesis in the liver. Decreased glucose metabolism leads to elevated rates of lipolysis, resulting in increased release of free FAs. Free FAs have a negative effect on insulin action on the liver, which in turn results in increased gluconeogenesis and systemic dyslipidaemia. It has been postulated that disruption of glucose transport and insulin secretion by prolonged excessive FA availability might be a non-genetic factor that has contributed to the staggering rise in prevalence of type 2 diabetes. As glucokinase (GK) is a key regulatory enzyme of hepatic glucose metabolism, changes in its activity may alter flux through the glycolytic and de novo lipogenic pathways and result in hyperglycaemia and ultimately insulin resistance. This thesis investigated the effects of saturated FA on the promoter activity of the glycolytic enzyme, GK, and various transcription factors that may influence the regulation of GK gene expression. These experiments have shown that the saturated FA, palmitate, is capable of decreasing GK promoter activity. In addition, quantitative real-time PCR has shown that palmitate incubation may also regulate GK gene expression through a known FA sensitive transcription factor, sterol regulatory element binding protein-1c (SREBP-1c), which upregulates GK transcription. To parallel the investigations into the mechanisms of FA molecular signalling, further studies of the effect of FAs on metabolic pathway flux were performed. Although certain FAs reduce SREBP-1c transcription in vitro, it is unclear whether this will result in decreased GK activity in vivo where positive effectors of SREBP-1c such as insulin are also present. 
Under these conditions, it is uncertain if the inhibitory effects of FAs would be overcome by insulin. The effects of a combination of FAs, insulin and glucose on glucose phosphorylation and metabolism in cultured primary rat hepatocytes, at concentrations that mimic those in the portal circulation after a meal, were examined. It was found that total GK activity was unaffected by an increased concentration of insulin, but palmitate and eicosapentaenoic acid significantly lowered total GK activity in the presence of insulin. Despite the fact that total GK enzyme activity was reduced in response to FA incubation, GK enzyme translocation from the inactive, nuclear-bound state to the active, cytoplasmic state was unaffected. Interestingly, none of the FAs tested inhibited glucose phosphorylation or the rate of glycolysis when insulin is present. These results suggest that in the presence of insulin the levels of the active, unbound cytoplasmic GK are sufficient to buffer a slight decrease in GK enzyme activity and decreased promoter activity caused by FA exposure. Although a high fat diet has been associated with impaired hepatic glucose metabolism, there is no evidence from this thesis that FAs themselves directly modulate flux through the glycolytic pathway in isolated primary hepatocytes when insulin is also present. Therefore, although FA affected expression of a wide range of genes, including GK, this did not affect glycolytic flux in the presence of insulin. However, it may be possible that a saturated FA-induced decrease in GK enzyme activity, when combined with the onset of insulin resistance, may promote the dysregulation of glucose homeostasis and the subsequent development of hyperglycaemia, metabolic syndrome and type 2 diabetes.