410 results for Second Nature


Relevance:

20.00%

Publisher:

Abstract:

Increased industrialisation has brought to the forefront the susceptibility of concrete columns in both buildings and bridges to vehicle impacts. Accurate vulnerability assessments are crucial in the design process due to the potentially catastrophic nature of the failures that such impacts can cause. This paper reports on research undertaken to investigate the impact capacity of columns in low- to medium-rise buildings designed according to Australian Standards. Numerical simulation techniques were used in the process, and validation was done using experimental results published in the literature. The investigation thus far has confirmed the vulnerability of typical columns in five-storey buildings located in urban areas to medium-velocity car impacts; hence these columns need to be re-designed (if possible) or retrofitted. In addition, the accuracy of the simplified method presented in EN 1991 to quantify the impact damage was scrutinised. A simplified concept to assess the damage due to all collision modes was introduced. The research information will be extended to generate a common database to assess the vulnerability of columns in urban areas against the new generation of vehicles.
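The simplified method in EN 1991 (Part 1-7, Annex C) models a vehicle collision as a "hard impact" and reduces it to an equivalent static force, F = v_r·√(k·m). A minimal sketch of that calculation follows; the velocity, stiffness and mass below are illustrative values, not figures taken from the paper:

```python
import math

def hard_impact_force(v_r, k, m):
    """Equivalent static force for a 'hard impact' (EN 1991-1-7, Annex C):
    F = v_r * sqrt(k * m), where v_r is the object velocity at impact [m/s],
    k the equivalent elastic stiffness of the object [N/m], and m its mass [kg]."""
    return v_r * math.sqrt(k * m)

# Illustrative passenger-car values (assumptions for this sketch):
# v_r = 12.5 m/s (~45 km/h), k = 300 kN/m, m = 1500 kg
F = hard_impact_force(12.5, 300e3, 1500.0)
print(f"Equivalent static force: {F / 1e3:.0f} kN")
```

A column design would then be checked against this force applied at bumper height; the point of the paper is that such a static simplification can misrepresent the actual dynamic damage.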

Relevance:

20.00%

Publisher:

Abstract:

South Africa's modern architecture is not confined to the cities, but the ideas of the movement were mostly disseminated by architects and academics in Johannesburg, Pretoria, Durban and Cape Town, its four major urban centres. The layout of significant areas of each city was also influenced by international modernist plans. In outlining the achievements and innovative designs of architects in these cities between the 1930s and 1970s, this article draws a picture of the importance of modernism in South African urban space, and of its diversity. It also draws attention to the political nature of the South African landscape and space, where modernist design was used for racial purposes, and to past and present conservation ideologies. The second part of the article concerns the conservation of modern buildings in these centres; it cites bibliographies and lists the registers, both existing and under construction. It concludes with an overview of the conservation legislation in place and the challenges of conservation in a context of changing cultural values.

Relevance:

20.00%

Publisher:

Abstract:

This chapter describes the nature of human language and introduces theories of language acquisition.

Relevance:

20.00%

Publisher:

Abstract:

This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are ‘hybrid’ in nature, in that they are a composition of components whose individual properties may be easily described, but whose performance as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms. The goal of this thesis is to analyse the computational aspects of hybrid models and hybrid algorithms in the Bayesian context. The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate to both the quality of model fit to data and the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described, and the model is compared to the Normal case based on goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of the number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures, which have not been adequately considered in the literature. The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches are considered: a formal comparison of the performance of a range of hybrid algorithms, and a theoretical investigation of the performance of one of these algorithms in high dimensions.
For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis-adjusted Langevin algorithm, and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as a set of examples of hybrid algorithms. The statistical literature often treats statistical efficiency as the only criterion for an efficient algorithm. In this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual algorithms contribute to the overall efficiency of hybrid algorithms, and highlights weaknesses that may be introduced by the process of combining these components in a single algorithm. The second approach to considering computational aspects of hybrid algorithms involves an investigation of the performance of the PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time. In particular, importance-sampling-based algorithms, including the PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general sampler, and explores a fundamental problem that occurs in applying importance sampling to a high-dimensional problem. The precision of the computed estimate from the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. Additionally, the exponential growth of the asymptotic variance with the dimension is demonstrated, and it is shown that the optimal covariance matrix for the importance function can be estimated in a special case.
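The instability of importance sampling in high dimensions can be seen in a small numerical experiment. The sketch below is our own illustration (not code from the thesis): it measures the effective sample size of normalised importance weights for a standard Gaussian target and an over-dispersed Gaussian proposal as the dimension grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def ess_fraction(d, sigma=1.5, n=5000):
    """Effective sample size (as a fraction of n) for importance sampling
    with target N(0, I_d) and proposal N(0, sigma^2 I_d)."""
    x = rng.normal(scale=sigma, size=(n, d))
    # Log-weights = log target - log proposal; constants that do not depend
    # on x cancel once the weights are normalised.
    logw = -0.5 * (x**2).sum(axis=1) + 0.5 * ((x / sigma) ** 2).sum(axis=1)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return 1.0 / (n * (w**2).sum())

print(ess_fraction(1), ess_fraction(50))  # degrades sharply with dimension
```

In one dimension most samples carry useful weight; by fifty dimensions a handful of samples dominate, which is the weight-degeneracy phenomenon behind the exponential growth of the asymptotic variance discussed above.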

Relevance:

20.00%

Publisher:

Abstract:

This study explores young people's creative practice through using Information and Communications Technologies (ICTs) in one particular learning area: Drama. The study focuses on school-based contexts and the impact of ICT-based interventions within two drama education case studies. The first pilot study involved the use of online spaces to complement a co-curricular performance project. The second focus case was a curriculum-based project with online spaces and digital technologies being used to create a cyberdrama. Each case documents the activity systems, participant experiences and meaning making in specific institutional and technological contexts. The nature of creative practice and learning is analysed, using frameworks drawn from Vygotsky's socio-historical theory (including his work on creativity) and from activity theory. Case study analysis revealed the nature of the contradictions encountered, and these required an analysis of institutional constraints and the dynamics of power. Cyberdrama offers young people opportunities to explore drama through new modes, and the use of ICTs can be seen as contributing different tools, spaces and communities for creative activity. To be able to engage in creative practice using ICTs requires a focus on a range of cultural tools and social practices beyond those of the purely technological. Cybernetic creative practice requires flexibility in the negotiation of tool use and subjects, and a system that responds to feedback and can adapt. Classroom-based dramatic practice may allow for the negotiation of power and tool use in the development of collaborative works of the imagination. However, creative practice using ICTs in schools is typically restricted by authoritative power structures and access issues. The research identified participant engagement and meaning making emerging from different factors, with some students showing preferences for embodied creative practice in Drama that did not involve ICTs.
The findings of the study suggest ICT-based interventions need to focus not only on different applications for the technology but also on embodied experience, the negotiation of power, identity and human interactions.

Relevance:

20.00%

Publisher:

Abstract:

In 2009, Religious Education is a designated key learning area in Catholic schools in the Archdiocese of Brisbane and, indeed, across Australia. Over the years, though, different conceptualisations of the nature and purpose of religious education have led to the construction of different approaches to the classroom teaching of religion. By investigating the development of religious education policy in the Archdiocese of Brisbane from 1984 to 2003, the study seeks to trace the emergence of new discourses on religious education. The study understands religious education to refer to a lifelong process that occurs through a variety of forms (Moran, 1989). In Catholic schools, it refers both to co-curricular activities, such as retreats and school liturgies, and the classroom teaching of religion. It is the policy framework for the classroom teaching of religion that this study explores. The research was undertaken using a policy case study approach to gain a detailed understanding of how new conceptualisations of religious education emerged at a particular site of policy production, in this case, the Archdiocese of Brisbane. The study draws upon Yeatman’s (1998) description of policy as occurring “when social actors think about what they are doing and why in relation to different and alternative possible futures” (p. 19) and views policy as consisting of more than texts themselves. Policy texts result from struggles over meaning (Taylor, 2004) in which specific discourses are mobilised to support particular views. The study has a particular interest in the analysis of Brisbane religious education policy texts, the discursive practices that surrounded them, and the contexts in which they arose. Policy texts are conceptualised in the study as representing “temporary settlements” (Gale, 1999).
Such settlements are asymmetrical, temporary and dependent on context: asymmetrical in that dominant actors are favoured; temporary because dominant actors are always under challenge by other actors in the policy arena; and context-dependent because new situations require new settlements. To investigate the official policy documents, the study used Critical Discourse Analysis (hereafter referred to as CDA) as a research tool that affords the opportunity for researchers to map and chart the emergence of new discourses within the policy arena. As developed by Fairclough (2001), CDA is a three-dimensional application of critical analysis to language. In the Brisbane religious education arena, policy texts formed a genre chain (Fairclough, 2004; Taylor, 2004) which was a focus of the study. There are two features of texts that form genre chains: texts are systematically linked to one another; and, systematic relations of recontextualisation exist between the texts. Fairclough’s (2005) concepts of “imaginary space” and “frameworks for action” (p. 65) within the policy arena were applied to the Brisbane policy arena to investigate the relationship between policy statements and subsequent guidelines documents. Five key findings emerged from the study. First, application of CDA to policy documents revealed that a fundamental reconceptualisation of the nature and purpose of classroom religious education in Catholic schools occurred in the Brisbane policy arena over the last twenty-five years. Second, a disjuncture existed between catechetical discourses that continued to shape religious education policy statements, and educational discourses that increasingly shaped guidelines documents. Third, recontextualisation between policy documents was evident and dependent on the particular context in which religious education occurred.
Fourth, at subsequent links in the chain, actors created their own “imaginary space”, thereby altering orders of discourse within the policy arena, with different actors being either foregrounded or marginalised. Fifth, intertextuality was more evident in the later links in the genre chain (i.e. 1994 policy statement and 1997 guidelines document) than in earlier documents. On the basis of the findings of the study, six recommendations are made. First, the institutional Church should carefully consider the contribution that the Catholic school can make to the overall pastoral mission of the diocese in twenty-first century Australia. Second, policymakers should articulate a nuanced understanding of the relationship between catechesis and education with regard to the religion classroom. Third, there should be greater awareness of the connections among policies relating to Catholic schools – especially the connection between enrolment policy and religious education policy. Fourth, there should be greater consistency between policy documents. Fifth, policy documents should be helpful for those to whom they are directed (i.e. Catholic schools, teachers). Sixth, “imaginary space” (Fairclough, 2005) in policy documents needs to be constructed in a way that allows for multiple “frameworks for action” (Fairclough, 2005) through recontextualisation. The findings of this study are significant in a number of ways. For religious educators, the study highlights the need to develop a shared understanding of the nature and purpose of classroom religious education. It argues that this understanding must take into account the multifaith nature of Australian society and the changing social composition of Catholic schools themselves. Greater recognition should be given to the contribution that religious studies courses such as Study of Religion make to the overall religious development of a person. 
In view of the social composition of Catholic schools, there is also an issue of ecclesiological significance concerning the conceptualisation of the relationship between the institutional Catholic Church and Catholic schools. Finally, the study is of significance because of its application of CDA to religious education policy documents. Use of CDA reveals the foregrounding, marginalising, or excluding of various actors in the policy arena.

Relevance:

20.00%

Publisher:

Abstract:

Actions for wrongful life, as they have unfortunately come to be styled, encompass various types of claim. These include claims for alleged negligence after conception, and those based on negligent advice or diagnosis prior to conception concerning possible effects of treatment given to the child's mother, contraception or sterilisation, or genetic disability. This distinguishes such claims from those for so-called wrongful birth, which are claims by parents for the cost of raising either a healthy or a disabled child where the unplanned birth imposes costs on the parents as a result of clinical negligence. Two of the more controversial cases of interest to us here to have reached the High Court of Australia in the past decade are Cattanach v Melchior, where the court, by a narrow majority (McHugh, Gummow, Kirby and Callinan JJ; Gleeson CJ, Hayne and Heydon JJ dissenting), acknowledged recovery for wrongful birth, and the joined appeals of Harriton v Stephens and Waller v James; Waller v Hoolahan, in which the court overwhelmingly precluded a ‘wrongful life’ claim (Gleeson CJ, Gummow, Hayne, Callinan, Heydon and Crennan JJ; Kirby J dissenting). Both cases raised issues around the sanctity and value of life, the nature of harm and the assessment of damages, and this brief note affords us the opportunity to consider the way in which the ‘life as legal loss’ arguments were treated by the various judges in both cases.

Relevance:

20.00%

Publisher:

Abstract:

Uninhabited aerial vehicles (UAVs) are a cutting-edge technology that is at the forefront of aviation/aerospace research and development worldwide. Many consider their current military and defence applications as just a token of their enormous potential. Unlocking and fully exploiting this potential will see UAVs in a multitude of civilian applications and routinely operating alongside piloted aircraft. The key to realising the full potential of UAVs lies in addressing a host of regulatory, public relations, and technological challenges never encountered before. Aircraft collision avoidance is considered to be one of the most important issues to be addressed, given its safety critical nature. The collision avoidance problem can be roughly organised into three areas: 1) Sense; 2) Detect; and 3) Avoid. Sensing is concerned with obtaining accurate and reliable information about other aircraft in the air; detection involves identifying potential collision threats based on available information; avoidance deals with the formulation and execution of appropriate manoeuvres to maintain safe separation. This thesis tackles the detection aspect of collision avoidance, via the development of a target detection algorithm that is capable of real-time operation onboard a UAV platform. One of the key challenges of the detection problem is the need to provide early warning. This translates to detecting potential threats whilst they are still far away, when their presence is likely to be obscured and hidden by noise. Another important consideration is the choice of sensors to capture target information, which has implications for the design and practical implementation of the detection algorithm.
The main contributions of the thesis are: 1) the proposal of a dim target detection algorithm combining image morphology and hidden Markov model (HMM) filtering approaches; 2) the novel use of relative entropy rate (RER) concepts for HMM filter design; 3) the characterisation of algorithm detection performance based on simulated data as well as real in-flight target image data; and 4) the demonstration of the proposed algorithm's capacity for real-time target detection. We also consider the extension of HMM filtering techniques and the application of RER concepts for target heading angle estimation. In this thesis we propose a computer-vision based detection solution, due to the commercial-off-the-shelf (COTS) availability of camera hardware and the hardware's relatively low cost, power, and size requirements. The proposed target detection algorithm adopts a two-stage processing paradigm that begins with an image enhancement pre-processing stage followed by a track-before-detect (TBD) temporal processing stage that has been shown to be effective in dim target detection. We compare the performance of two candidate morphological filters for the image pre-processing stage, and propose a multiple hidden Markov model (MHMM) filter for the TBD temporal processing stage. The role of the morphological pre-processing stage is to exploit the spatial features of potential collision threats, while the MHMM filter serves to exploit the temporal characteristics or dynamics. The problem of optimising our proposed MHMM filter has been examined in detail. Our investigation has produced a novel design process for the MHMM filter that exploits information theory and entropy related concepts. The filter design process is posed as a mini-max optimisation problem based on a joint RER cost criterion. 
We provide proof that this joint RER cost criterion provides a bound on the conditional mean estimate (CME) performance of our MHMM filter, and this in turn establishes a strong theoretical basis connecting our filter design process to filter performance. Through this connection we can intelligently compare and optimise candidate filter models at the design stage, rather than having to resort to time-consuming Monte Carlo simulations to gauge the relative performance of candidate designs. Moreover, the underlying entropy concepts are not constrained to any particular model type. This suggests that the RER concepts established here may be generalised to provide a useful design criterion for multiple model filtering approaches outside the class of HMM filters. In this thesis we also evaluate the performance of our proposed target detection algorithm under realistic operation conditions, and give consideration to the practical deployment of the detection algorithm onboard a UAV platform. Two fixed-wing UAVs were engaged to recreate various collision-course scenarios to capture highly realistic vision (from an onboard camera perspective) of the moments leading up to a collision. Based on this collected data, our proposed detection approach was able to detect targets out to distances ranging from about 400m to 900m. These distances (with some assumptions about closing speeds and aircraft trajectories) translate to an advance warning ahead of impact that approaches the 12.5 second response time recommended for human pilots. Furthermore, readily available graphics processing unit (GPU) based hardware is exploited for its parallel computing capabilities to demonstrate the practical feasibility of the proposed target detection algorithm. A prototype hardware-in-the-loop system has been found to be capable of achieving data processing rates sufficient for real-time operation. There is also scope for further improvement in performance through code optimisations.
Overall, our proposed image-based target detection algorithm offers UAVs a cost-effective real-time target detection capability that is a step forward in addressing the collision avoidance issue that is currently one of the most significant obstacles preventing widespread civilian applications of uninhabited aircraft. We also highlight that the algorithm development process has led to the discovery of a powerful multiple HMM filtering approach and a novel RER-based multiple filter design process. The utility of our multiple HMM filtering approach and RER concepts, however, extends beyond the target detection problem. This is demonstrated by our application of HMM filters and RER concepts to a heading angle estimation problem.
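The morphological pre-processing idea can be illustrated with a toy one-dimensional close-minus-open filter (our own minimal sketch, not the thesis's implementation, which operates on real image frames): closing and opening with a small flat structuring element suppress the slowly varying background while a point-like "dim target" survives the difference.

```python
import numpy as np

def erode(x, w):
    # Grayscale erosion with a flat structuring element of width w
    pad = w // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + w].min() for i in range(len(x))])

def dilate(x, w):
    # Grayscale dilation with a flat structuring element of width w
    pad = w // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + w].max() for i in range(len(x))])

def close_minus_open(x, w=5):
    """Close-minus-open (CMO) filter: closing preserves bright point features,
    opening removes them; their difference highlights point-like targets."""
    closing = erode(dilate(x, w), w)
    opening = dilate(erode(x, w), w)
    return closing - opening

# Smooth background ramp with a one-pixel "target" spike at index 50
x = np.linspace(0.0, 1.0, 100)
x[50] += 0.5
y = close_minus_open(x)
print(int(np.argmax(y)))  # the spike location stands out against the ramp
```

In the two-stage paradigm described above, the output of a filter like this would then be fed to the temporal (track-before-detect) stage, which accumulates evidence across frames.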

Relevance:

20.00%

Publisher:

Abstract:

As Web searching becomes more prolific for information access worldwide, we need to better understand users’ Web searching behaviour and develop better models of their interaction with Web search systems. Web search modelling is a significant and important area of Web research. Searching on the Web is an integral element of information behaviour and human–computer interaction. Web searching includes multitasking processes, the allocation of cognitive resources among several tasks, and shifts in cognitive, problem and knowledge states. In addition to multitasking, cognitive coordination and cognitive shifts are also important, but are under-explored aspects of Web searching. During the Web searching process, beyond physical actions, users experience various cognitive activities. Interactive Web searching involves many users’ cognitive shifts at different information behaviour levels. Cognitive coordination allows users to balance the dependencies among multiple information tasks and the resources available. Much research has been conducted into Web searching. However, few studies have modelled the nature of and relationship between multitasking, cognitive coordination and cognitive shifts in the Web search context. Modelling how Web users interact with Web search systems is vital for the development of more effective Web IR systems. This study aims to model the relationship between multitasking, cognitive coordination and cognitive shifts during Web searching. A preliminary theoretical model is presented based on previous studies. The research is designed to validate the preliminary model. Forty-two study participants were involved in the empirical study. A combination of data collection instruments, including pre- and post-questionnaires, think-aloud protocols, search logs, observations and interviews were employed to obtain users’ comprehensive data during Web search interactions.
Based on the grounded theory approach, qualitative analysis methods including content analysis and verbal protocol analysis were used to analyse the data. The findings were inferred through an analysis of questionnaires, a transcription of think-aloud protocols, the Web search logs, and notes on observations and interviews. Five key findings emerged. (1) Multitasking during Web searching was demonstrated as a two-dimensional behaviour. The first dimension was represented as multiple information problems searching by task switching. Users’ Web searching behaviour was a process of multiple tasks switching, that is, from searching on one information problem to searching another. The second dimension of multitasking behaviour was represented as an information problem searching within multiple Web search sessions. Users usually conducted Web searching on a complex information problem by submitting multiple queries, using several Web search systems and opening multiple windows/tabs. (2) Cognitive shifts were the brain’s internal response to external stimuli. Cognitive shifts were found as an essential element of searching interactions and users’ Web searching behaviour. The study revealed two kinds of cognitive shifts. The first kind, the holistic shift, included users’ perception on the information problem and overall information evaluation before and after Web searching. The second kind, the state shift, reflected users’ changes in focus between the different cognitive states during the course of Web searching. Cognitive states included users’ focus on the states of topic, strategy, evaluation, view and overview. (3) Three levels of cognitive coordination behaviour were identified: the information task coordination level, the coordination mechanism level, and the strategy coordination level. The three levels of cognitive coordination behaviour interplayed to support multiple information tasks switching. 
(4) An important relationship existed between multitasking, cognitive coordination and cognitive shifts during Web searching. Cognitive coordination as a management mechanism bound together other cognitive processes, including multitasking and cognitive shifts, in order to move through users’ Web searching process. (5) Web search interaction was shown to be a multitasking process which included information problems ordering, task switching and task and mental coordinating; also, at a deeper level, cognitive shifts took place. Cognitive coordination was the hinge behaviour linking multitasking and cognitive shifts. Without cognitive coordination, neither multitasking Web searching behaviour nor the complicated mental process of cognitive shifting could occur. The preliminary model was revisited with these empirical findings. A revised theoretical model (MCC Model) was built to illustrate the relationship between multitasking, cognitive coordination and cognitive shifts during Web searching. Implications and limitations of the study are also discussed, along with future research work.

Relevance:

20.00%

Publisher:

Abstract:

This thesis deals with the problem of instantaneous frequency (IF) estimation of sinusoidal signals. This topic plays a significant role in signal processing and communications. Depending on the type of signal, two major approaches are considered. For IF estimation of single-tone or digitally-modulated sinusoidal signals (such as frequency shift keying signals) the approach of digital phase-locked loops (DPLLs) is considered; this is Part-I of the thesis. For FM signals the approach of time-frequency analysis is considered; this is Part-II of the thesis. In Part-I we have utilized sinusoidal DPLLs with a non-uniform sampling scheme, as this type is widely used in communication systems. The digital tanlock loop (DTL) has introduced significant advantages over other existing DPLLs. In the last ten years many efforts have been made to improve DTL performance. However, this loop and all of its modifications utilize a Hilbert transformer (HT) to produce a signal-independent 90-degree phase-shifted version of the input signal. The Hilbert transformer can be realized approximately using a finite impulse response (FIR) digital filter. This realization introduces further complexity in the loop, in addition to approximations and frequency limitations on the input signal. We have tried to avoid the practical difficulties associated with the conventional tanlock scheme while keeping its advantages. A time-delay is utilized in the tanlock scheme of the DTL to produce a signal-dependent phase shift. This gave rise to the time-delay digital tanlock loop (TDTL). Fixed point theorems are used to analyze the behavior of the new loop. As such, TDTL combines the two major approaches in DPLLs: the non-linear approach of the sinusoidal DPLL based on fixed point analysis, and the linear tanlock approach based on arctan phase detection. TDTL preserves the main advantages of the DTL despite its reduced structure. An application of TDTL in FSK demodulation is also considered.
This idea of replacing the HT by a time-delay may be of interest in other signal processing systems. Hence we have analyzed and compared the behaviors of the HT and the time-delay in the presence of additive Gaussian noise. Based on this analysis, the behavior of the first- and second-order TDTLs has been analyzed in additive Gaussian noise. Since DPLLs need time for locking, they are normally not efficient in tracking the continuously changing frequencies of non-stationary signals, i.e. signals with time-varying spectra. Non-stationary signals are of importance in synthetic and real-life applications. An example is the class of frequency-modulated (FM) signals widely used in communication systems. Part-II of this thesis is dedicated to the IF estimation of non-stationary signals. For such signals the classical spectral techniques break down, due to the time-varying nature of their spectra, and more advanced techniques should be utilized. For the purpose of instantaneous frequency estimation of non-stationary signals there are two major approaches: parametric and non-parametric. We chose the non-parametric approach, which is based on time-frequency analysis. This approach is computationally less expensive and more effective in dealing with multicomponent signals, which are the main aim of this part of the thesis. A time-frequency distribution (TFD) of a signal is a two-dimensional transformation of the signal to the time-frequency domain. Multicomponent signals can be identified by multiple energy peaks in the time-frequency domain. Many real-life and synthetic signals are of a multicomponent nature, and there is little in the literature concerning IF estimation of such signals. This is why we have concentrated on multicomponent signals in Part-II. An adaptive algorithm for IF estimation using quadratic time-frequency distributions has been analyzed. A class of time-frequency distributions that are more suitable for this purpose has been proposed.
The kernels of this class are time-only or one-dimensional, rather than the time-lag (two-dimensional) kernels; hence this class has been named the T-class. If the parameters of these TFDs are properly chosen, they are more efficient than the existing fixed-kernel TFDs in terms of resolution (energy concentration around the IF) and artifact reduction. The T-distributions have been used in the adaptive IF algorithm and proved to be efficient in tracking rapidly changing frequencies. They also enable direct amplitude estimation for the components of a multicomponent signal.
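The arctan (tanlock) phase-detection idea, with a time-delay standing in for the Hilbert transformer, can be made concrete with a small numerical sketch. This is our own illustration under the assumption that the delay is tuned to a quarter period of the nominal frequency, so the delayed sample is a 90-degree shifted version of the input; all values are illustrative:

```python
import math

f0 = 100.0             # nominal input frequency [Hz] (illustrative value)
tau = 1.0 / (4 * f0)   # quarter-period delay: 90-degree phase shift at f0
phase = 0.7            # input phase offset to be detected [rad]

def s(t):
    # The sinusoidal input signal
    return math.sin(2 * math.pi * f0 * t + phase)

t = 0.013                      # an arbitrary sampling instant
y_i = s(t)                     # in-phase sample: sin(theta)
y_q = s(t - tau)               # delayed sample: sin(theta - pi/2) = -cos(theta)
theta = math.atan2(y_i, -y_q)  # arctan phase detection: recovers theta (mod 2*pi)
print(theta)
```

For input frequencies away from f0 the delay no longer gives an exact 90-degree shift, which is precisely the signal-dependent behaviour that the TDTL's fixed-point analysis addresses.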

Relevance:

20.00%

Publisher:

Abstract:

The concept of radar was developed for the estimation of the distance (range) and velocity of a target from a receiver. The distance measurement is obtained by measuring the time taken for the transmitted signal to propagate to the target and return to the receiver. The target's velocity is determined by measuring the Doppler-induced frequency shift of the returned signal caused by the rate of change of the time-delay from the target. As researchers further developed conventional radar systems it became apparent that additional information was contained in the backscattered signal, and that this information could in fact be used to describe the shape of the target itself. This is due to the fact that a target can be considered to be a collection of individual point scatterers, each of which has its own velocity and time-delay. Delay-Doppler parameter estimation of each of these point scatterers thus corresponds to a mapping of the target's range and cross-range, thus producing an image of the target. Much research has been done in this area since the early radar imaging work of the 1960s. At present there are two main categories into which radar imaging falls. The first of these relates to the case where the backscattered signal is considered to be deterministic. The second relates to the case where the backscattered signal is of a stochastic nature. In both cases the information which describes the target's scattering function is extracted by the use of the ambiguity function, a function which correlates the backscattered signal in time and frequency with the transmitted signal. In practical situations, it is often necessary to have the transmitter and the receiver of the radar system sited at different locations. The problem in these situations is that a reference signal must then be present in order to calculate the ambiguity function.
This causes an additional problem in that detailed phase information about the transmitted signal is then required at the receiver. It is this latter problem which has led to the investigation of radar imaging using time-frequency distributions. As will be shown in this thesis, the phase information about the transmitted signal can be extracted from the backscattered signal using time-frequency distributions. The principal aim of this thesis was the development of, and subsequent discussion of, the theory of radar imaging using time-frequency distributions. Consideration is first given to the case where the target is diffuse, i.e. where the backscattered signal has temporal stationarity and a spatially white power spectral density. The complementary situation is also investigated, i.e. where the target is no longer diffuse, but some degree of correlation exists between the time-frequency points. Computer simulations are presented to demonstrate the concepts and theories developed in the thesis. For the proposed radar system to be practically realisable, both the time-frequency distributions and the associated algorithms developed must be able to be implemented in a timely manner. For this reason an optical architecture is proposed. This architecture is specifically designed to obtain the required time and frequency resolution when using laser radar imaging. The complex light amplitude distributions produced by this architecture have been computer simulated using an optical compiler.
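The delay-Doppler correlation performed by the ambiguity function can be sketched numerically. The following is a toy example of ours (a Gaussian-envelope pulse on a complex carrier, not the thesis's laser-radar setup) computing the magnitude of the discrete narrowband ambiguity surface and locating its peak:

```python
import numpy as np

def ambiguity(s, fs, delays, dopplers):
    """Magnitude of the discrete narrowband ambiguity function:
    |A(tau, nu)| = |sum_n s[n] conj(s[n - tau]) exp(-j 2 pi nu n / fs)|."""
    n = np.arange(len(s))
    A = np.zeros((len(delays), len(dopplers)))
    for i, d in enumerate(delays):
        lagged = np.roll(s, d)       # circular lag; harmless here because the
        prod = s * np.conj(lagged)   # pulse envelope vanishes at the edges
        for j, nu in enumerate(dopplers):
            A[i, j] = abs(np.sum(prod * np.exp(-2j * np.pi * nu * n / fs)))
    return A

fs = 1000.0
t = np.arange(256) / fs
# Toy "backscattered" signal: Gaussian-envelope pulse on a 100 Hz carrier
s = np.exp(-((t - 0.128) ** 2) / (2 * 0.02**2)) * np.exp(2j * np.pi * 100.0 * t)
delays = list(range(-8, 9))                # lag in samples
dopplers = np.linspace(-20.0, 20.0, 41)    # Doppler shift in Hz
A = ambiguity(s, fs, delays, dopplers)
i, j = np.unravel_index(np.argmax(A), A.shape)
print(delays[i], dopplers[j])  # peak at zero delay and zero Doppler
```

A point scatterer with non-zero range and velocity would shift this peak to its delay-Doppler coordinates, which is the basis of the imaging described above; the bistatic problem discussed in the abstract arises because computing this surface requires a reference copy of the transmitted signal.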