851 results for The Real
Abstract:
National meteorological offices are largely concerned with synoptic-scale forecasting, where weather predictions are produced for a whole country for 24 hours ahead. In practice, many local organisations (such as emergency services, construction industries, forestry, farming, and sports) require only local, short-term, bespoke weather predictions and warnings. This thesis shows that these less demanding requirements do not call for exceptional computing power and can be met by a modern desktop system which monitors site-specific ground conditions (such as temperature, pressure, wind speed and direction, etc.) augmented with above-ground information from satellite images to produce 'nowcasts'. The emphasis in this thesis is on the design of such a real-time system for nowcasting. Local site-specific conditions are monitored using a custom-built, stand-alone, Motorola 6809 based sub-system. Above-ground information is received from the METEOSAT 4 geostationary satellite using a sub-system based on commercially available equipment. The information is ephemeral and must be captured in real time. The real-time nowcasting system for localised weather handles the data as a transparent task using the limited capabilities of the PC system. Ground data produces a time series of measurements at a specific location, representing the past-to-present atmospheric conditions of the particular site, from which much information can be extracted. The novel approach adopted in this thesis is to construct stochastic models based on the AutoRegressive Integrated Moving Average (ARIMA) technique. The satellite images contain features (such as cloud formations) which evolve dynamically and may be subject to movement, growth, distortion, bifurcation, superposition, or elimination between images. The process of extracting a weather feature, following its motion and predicting its future evolution involves algorithms for normalisation, partitioning, filtering, image enhancement, and correlation of multi-dimensional signals in different domains. To limit the processing requirements, the analysis in this thesis concentrates on an 'area of interest'; only a small fraction of the total image then needs to be processed, leading to a major saving in time. The thesis also proposes an extension to an existing manual cloud classification technique so that it can automatically classify a cloud feature over the 'area of interest' for nowcasting using the multi-dimensional signals.
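As a rough illustration of the ARIMA-based nowcasting idea described above, the following sketch fits a low-order model to a short series of site temperature readings and projects it a few steps ahead. The library, model order and data are illustrative assumptions, not the configuration used in the thesis.

```python
# Illustrative ARIMA nowcast of a site-specific ground measurement.
# The (p, d, q) order and the readings below are arbitrary examples,
# not the thesis's actual model or data.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical past-to-present temperature observations (degrees C).
temps = np.array([14.2, 14.5, 14.9, 15.3, 15.1, 15.6, 16.0, 16.4,
                  16.1, 16.7, 17.0, 17.2, 17.5, 17.3, 17.8, 18.1])

model = ARIMA(temps, order=(1, 1, 1))   # AR(1), first difference, MA(1)
fitted = model.fit()

# "Nowcast": project the series a few steps ahead.
forecast = fitted.forecast(steps=3)
print("Next three predicted readings:", forecast)
```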
Abstract:
Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing 'correct' and timely responses. Since these systems are increasingly being used in applications with safety implications, it is crucial that they are designed and developed to operate in a correct manner. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques which range from the mathematically intensive to those with some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component, because it is considered that the informality provides ease of understanding and the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between the formalisms are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems; this thesis adapts them for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. A common problem with Petri net based techniques is the complexity associated with generating the reachability graph. This thesis addresses this problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller: this type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
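To illustrate the reachability analysis that underlies this style of Petri-net verification, here is a minimal sketch that enumerates the reachability graph of a small place/transition net by breadth-first search and checks a simple safety property. The two-transition mutual-exclusion net is an invented example, not the manufacturing-system controller from the thesis, and no concurrency-set reduction is applied.

```python
# Minimal place/transition net and breadth-first construction of its
# reachability graph. The net below is an invented example.
from collections import deque

# transition name -> (tokens consumed per place, tokens produced per place)
TRANSITIONS = {
    "acquire": ({"idle": 1, "lock": 1}, {"busy": 1}),
    "release": ({"busy": 1},            {"idle": 1, "lock": 1}),
}

def canon(marking):
    """Canonical, hashable form of a marking (zero-token places dropped)."""
    return tuple(sorted((p, n) for p, n in marking.items() if n > 0))

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    nxt = dict(marking)
    for p, n in pre.items():
        nxt[p] -= n
    for p, n in post.items():
        nxt[p] = nxt.get(p, 0) + n
    return nxt

def reachability_graph(initial):
    """Map each reachable marking to its {transition: successor} arcs."""
    graph, queue = {}, deque([initial])
    while queue:
        marking = queue.popleft()
        key = canon(marking)
        if key in graph:
            continue
        graph[key] = {}
        for name, (pre, post) in TRANSITIONS.items():
            if enabled(marking, pre):
                successor = fire(marking, pre, post)
                graph[key][name] = canon(successor)
                queue.append(successor)
    return graph

graph = reachability_graph({"idle": 2, "lock": 1})
for marking, arcs in graph.items():
    print(marking, "->", arcs)

# A simple safety check over the whole graph: no reachable marking
# ever holds more than one "busy" token (mutual exclusion).
assert all(dict(m).get("busy", 0) <= 1 for m in graph)
```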
Abstract:
Background: The purpose of this study was to investigate the 12-month outcome of macular edema secondary to both chronic and new central and branch retinal vein occlusions treated with intravitreal bevacizumab in the real-life clinical setting in the UK. Methods: Retrospective case-notes analysis of consecutive patients with retinal vein occlusions treated with bevacizumab from 2010 to 2012. Outcome measures were visual acuity (measured with Snellen, converted into logMAR [logarithm of the minimum angle of resolution] for statistical calculation) and central retinal thickness at baseline, 4 weeks post-loading phase, and at 1 year. Results: There were 56 and 100 patients with central and branch retinal vein occlusions, respectively, of whom 62% had chronic edema and had received prior therapies, and another 32% required additional laser treatments after baseline bevacizumab. Baseline median visual acuity was 0.78 (interquartile range [IQR] 0.48–1.22) in the central group and 0.6 (IQR 0.3–0.78) in the branch group. In both groups, visual improvement was statistically significant from baseline compared to post-loading (P<0.001 and P=0.03, respectively), but was not significant by month 12 (P=0.058 and P=0.166, respectively); 30% improved by at least three lines and 44% improved by at least one line by month 12. Baseline median central retinal thickness was 449 µm (IQR 388–553) in the central group and 441 µm (IQR 357–501) in the branch group. However, the mean reduction in thickness was statistically significant at post-loading (P<0.001) and at the 12-month time point (P<0.001) for both groups. The average number of injections in 1 year was 4.2 in the central group and 3.3 in the branch group. Conclusion: Our large real-world cohort results indicate that bevacizumab introduced to patients with either new or chronic edema due to retinal vein occlusion can result in resolution of edema and stabilization of vision in the first year.
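The Snellen-to-logMAR conversion used in the methods is the standard transformation logMAR = -log10(Snellen fraction); a small illustrative sketch follows (the sample acuities are invented, not study data).

```python
# Convert Snellen fractions to logMAR: logMAR = -log10(numerator / denominator).
# The example acuities below are illustrative, not data from the study.
import math

def snellen_to_logmar(numerator: float, denominator: float) -> float:
    return -math.log10(numerator / denominator)

print(snellen_to_logmar(6, 60))   # 1.0   (6/60 on the metric chart)
print(snellen_to_logmar(20, 40))  # ~0.30 (20/40 on the imperial chart)
print(snellen_to_logmar(6, 6))    # 0.0   (normal acuity)
```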
Abstract:
The author models the carbon-dioxide emissions of a gas-fired power plant subject to compliance under the EU Emissions Trading System, using a real-options model on four underlying instruments (off-peak and peak electricity prices, gas price, and emission quota). The profit-maximizing plant operates and emits only when the spread earned on the electricity produced is positive. The aggregate carbon-dioxide emissions of a future period can be represented as a sum of European-style binary spread options. Within the model, the expected value and the probability-density function of emissions can be estimated; from the latter, the Value at Risk of the emission-quota position can be derived, which gives the cost of meeting the plant's compliance obligation at a given confidence level. In the stochastic model the underlying instruments follow geometric Ornstein-Uhlenbeck processes, which the author fitted to publicly available price data from the German energy exchange (EEX). Using the simulation model, the author examined how ceteris paribus changes in various technological and market factors affect the level of emissions and the cost of compliance.
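A rough sketch of the simulation idea: the logarithm of each underlying price follows an Ornstein-Uhlenbeck process (exact discretisation), and the plant is assumed to emit a fixed amount in every period in which its spread is positive, so expected emissions and a high quantile of the emission distribution can be read off the simulated paths. All parameter values, the emission factor and the simplified spread formula are invented for illustration and are not taken from the paper.

```python
# Monte Carlo sketch: geometric Ornstein-Uhlenbeck prices and expected
# CO2 emissions of a plant that runs only when its spread is positive.
# All parameters (kappa, sigma, long-run levels, heat rate, emission
# factor, capacity) are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def gou_paths(x0, kappa, theta, sigma, n_steps, n_paths, dt=1 / 365):
    """Geometric OU: the log-price follows an OU process (exact step)."""
    log_x = np.full(n_paths, np.log(x0))
    paths = np.empty((n_steps, n_paths))
    for t in range(n_steps):
        mean = theta + (log_x - theta) * np.exp(-kappa * dt)
        var = sigma**2 / (2 * kappa) * (1 - np.exp(-2 * kappa * dt))
        log_x = mean + np.sqrt(var) * rng.standard_normal(n_paths)
        paths[t] = np.exp(log_x)
    return paths

n_steps, n_paths = 365, 5000
power = gou_paths(60.0, kappa=3.0, theta=np.log(60.0), sigma=0.4,
                  n_steps=n_steps, n_paths=n_paths)
gas   = gou_paths(25.0, kappa=2.0, theta=np.log(25.0), sigma=0.3,
                  n_steps=n_steps, n_paths=n_paths)
quota = gou_paths(10.0, kappa=1.0, theta=np.log(10.0), sigma=0.5,
                  n_steps=n_steps, n_paths=n_paths)

heat_rate, emission_factor, capacity = 2.0, 0.4, 400.0   # illustrative
spread = power - heat_rate * gas - emission_factor * quota
runs = spread > 0.0                    # plant operates only on a positive spread

daily_emissions = emission_factor * capacity * 24 * runs  # tonnes per day
total = daily_emissions.sum(axis=0)    # yearly emissions on each simulated path

print("expected yearly emissions (t):", total.mean())
print("95th-percentile emissions, used for the quota VaR (t):",
      np.quantile(total, 0.95))
```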
Abstract:
The future power grid will effectively utilize renewable energy resources and distributed generation to respond to energy demand while incorporating information technology and communication infrastructure for optimum operation. This dissertation contributes to the development of real-time techniques for wide-area monitoring and secure real-time control and operation of hybrid power systems. To handle the increased level of real-time data exchange, this dissertation develops a supervisory control and data acquisition (SCADA) system equipped with a state estimation scheme driven by the real-time data. The system is verified on a specially developed laboratory-based test bed facility, serving as a hardware and software platform to emulate the actual scenarios of a real hybrid power system with the highest level of similarity and capability to practical utility systems. It includes phasor measurements at hundreds of measurement points on the system. These measurements were obtained from a specially developed laboratory-based Phasor Measurement Unit (PMU), used on the interconnected system alongside existing commercial PMUs. The studies included a new technique for detecting partially islanded microgrids, as well as several real-time techniques for synchronization and parameter identification of hybrid systems. Because renewable energy resources are increasingly integrated through DC microgrids, the dissertation also presents several practical cases for improving the interoperability of such systems. Furthermore, the growing number of small, dispersed generating stations, and their need to connect quickly and properly to the AC grid, motivated an exploration of the challenges that arise in synchronizing generators to the grid and the introduction of a Dynamic Brake system to improve the process of connecting distributed generators to the power grid. Real-time operation and control require secure data communication. A research effort in this dissertation therefore develops a Trusted Sensing Base (TSB) process for data communication security. The innovative TSB approach improves the security of the power grid as a cyber-physical system; it is based on available GPS synchronization technology and provides protection against confidentiality attacks in critical power system infrastructures.
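State estimation from SCADA and PMU data is commonly posed as a weighted least-squares problem; a generic linear sketch is given below. The measurement matrix, weights and readings are invented, and a practical estimator for an AC grid would use the nonlinear power-flow measurement functions and iterate this step.

```python
# Generic weighted least-squares state estimation:
#   x_hat = (H^T W H)^-1 H^T W z
# H, the measurement weights and the readings z are invented for illustration.
import numpy as np

H = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0, -1.0],
              [1.0,  1.0]])                 # measurement-to-state sensitivity
sigma = np.array([0.01, 0.01, 0.02, 0.02])  # per-measurement std deviations
W = np.diag(1.0 / sigma**2)                 # higher weight = more trusted meter
z = np.array([1.02, 0.98, 0.05, 1.99])      # noisy measurements

x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
residuals = z - H @ x_hat                   # large residuals flag bad data
print("estimated state:", x_hat)
print("measurement residuals:", residuals)
```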
Abstract:
Background: Depression is the largest contributing factor to years lost to disability, and symptom remission does not always result in functional improvement. Comprehensive analysis of functioning requires investigation both of the competence to perform behaviours and of actual performance in the real world. Further, two independent domains of functioning have been proposed: adaptive (behaviours conducive to daily living skills and independent functioning) and interpersonal (behaviours conducive to the successful initiation and maintenance of social relationships). To date, very little is known about the relationship between these constructs in depression, or about the factors that may play a key role in the disparity between competence and real-world performance in adaptive and interpersonal functioning. Purpose: This study used a multidimensional (adaptive and interpersonal functioning), multi-level (competence and performance) approach to explore the potential discrepancy between competence and real-world performance in depression, specifically investigating whether self-efficacy (one's beliefs about one's capability to perform particular actions) predicts depressed individuals' underperformance in the real world relative to their ability. A comparison sample of healthy participants was included to investigate the level of depressed individuals' impairment, across variables, relative to healthy individuals. Method: Forty-two participants with depression and twenty healthy participants with no history of, or current, psychiatric illness were recruited in the Kingston, Ontario community. Competence, self-efficacy, and real-world functioning, each in both adaptive and interpersonal domains, as well as symptoms, were assessed during a single-visit assessment. Results: Relative to healthy individuals, depressed individuals showed significantly poorer adaptive and interpersonal competence, poorer adaptive and interpersonal functioning, and significantly lower self-efficacy for adaptive and interpersonal behaviours. Self-efficacy significantly predicted functional disability in both the adaptive and interpersonal domains. Interpersonal self-efficacy accounted for significant variance in the discrepancy between interpersonal competence and functioning. Conclusions: The current study provides the first data on the relationships among competence, functioning, and self-efficacy in depression. Self-efficacy may play an important role in the deployment of functional skills in everyday life; this has implications for therapeutic interventions aimed at enhancing depressed individuals' engagement in functional activities. There may be additional intrinsic or extrinsic factors that influence the relationship between competence and functioning in depression.
Abstract:
The goal of this project is to increase the number of successful real estate license renewals while reducing the disruption caused by manual processing and by calls for assistance with renewals and technical issues. The data used in this project will demonstrate that the Real Estate Commission renewal process can be improved by using electronic resources such as more detailed website information and repeated e-mail notices, by modifying the online renewal process to reduce applicant error, and by increasing the visibility of online renewal log-in instructions while decreasing the visibility and use of mail-in applications.
Abstract:
Networked control over data networks has received increasing attention in recent years. Among the many problems in networked control systems (NCSs) is the need to reduce control latency and jitter and to deal with packet dropouts. This paper introduces our recent progress on a queuing communication architecture for real-time NCS applications, together with simple strategies for dealing with packet dropouts. Case studies for a medium-scale process and for multiple small-scale processes are presented for TCP/IP based real-time NCSs. Variations of network architecture design are modelled, simulated, and analysed to evaluate control latency and jitter performance. It is shown that a simple bandwidth upgrade or added hierarchy does not necessarily improve control latency and jitter. A co-design of network and control is necessary to maximise the real-time control performance of NCSs.
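As a toy illustration of the latency and jitter question, the following sketch simulates control packets passing through a single FIFO network queue and reports mean latency, jitter and worst case. The arrival rate, service time and queue discipline are invented parameters, not the case-study values from the paper.

```python
# Toy single-queue simulation of control-packet latency and jitter.
# Arrival rate, service time and the FIFO discipline are illustrative
# assumptions, not the network architectures studied in the paper.
import random
import statistics

random.seed(1)

ARRIVAL_RATE = 200.0   # packets per second (Poisson arrivals)
SERVICE_TIME = 0.004   # seconds to transmit one packet on the shared link
N_PACKETS = 10_000

t, link_free_at = 0.0, 0.0
latencies = []
for _ in range(N_PACKETS):
    t += random.expovariate(ARRIVAL_RATE)   # next packet arrival time
    start = max(t, link_free_at)            # wait if the link is busy (FIFO)
    link_free_at = start + SERVICE_TIME
    latencies.append(link_free_at - t)      # queueing delay + transmission

print(f"mean latency: {statistics.mean(latencies) * 1e3:.2f} ms")
print(f"jitter (std dev): {statistics.stdev(latencies) * 1e3:.2f} ms")
print(f"worst case: {max(latencies) * 1e3:.2f} ms")
```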
Abstract:
Characterization of indoor particle sources in 14 residential houses in Brisbane, Australia, was performed. Approximate PM2.5 concentrations and submicrometre particle number concentrations were measured simultaneously for more than 48 h in the kitchen of each house, using a photometer (DustTrak) and a condensation particle counter (CPC), respectively. From the real-time indoor particle concentration data and a diary of indoor activities, the indoor particle sources were identified. The study found that, among the indoor activities recorded, frying, grilling, stove use, toasting, cooking pizza, smoking, a candle vaporizing eucalyptus oil, and fan heater use could elevate indoor particle number concentration levels by more than five times. The indoor approximate PM2.5 concentrations could be close to 90 times, 30 times and three times higher than the background levels during grilling, frying and smoking, respectively.
Abstract:
Many factors affect the airflow patterns, thermal comfort, contaminant removal efficiency and indoor air quality at individual workstations in office buildings. In this study, four ventilation systems were tested in a chamber designed to represent an area of a typical office building floor and to reproduce the real characteristics of a modern office space. Measurements of particle concentration and thermal parameters (temperature and air velocity) were carried out for each of the following ventilation systems: a) a conventional air distribution system with ceiling supply and return; b) a conventional air distribution system with ceiling supply and return near the floor; c) an underfloor air distribution system; and d) a split system. The measurements aimed to analyse the particle removal efficiency in the breathing zone and the impact of particle concentration on an individual at the workstation. The efficiency of each ventilation system was analysed in terms of particle size and concentration, ventilation effectiveness and the Indoor/Outdoor ratio. Each ventilation system showed a different airflow pattern, and the efficiency of each system in removing particles from the breathing zone showed no correlation with particle size across the various methods of analysis used.
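The ventilation-effectiveness and Indoor/Outdoor measures mentioned above reduce to simple ratios; a small sketch using the common contaminant-removal-effectiveness definition (exhaust minus supply over breathing zone minus supply), with invented concentration values:

```python
# Contaminant removal effectiveness and Indoor/Outdoor ratio from particle
# concentrations. The formula used is the common textbook definition;
# the concentration values are invented, not measurements from the study.
def removal_effectiveness(c_exhaust: float, c_supply: float, c_breathing: float) -> float:
    return (c_exhaust - c_supply) / (c_breathing - c_supply)

def indoor_outdoor_ratio(c_indoor: float, c_outdoor: float) -> float:
    return c_indoor / c_outdoor

# Hypothetical particle number concentrations (particles/cm^3).
print(removal_effectiveness(c_exhaust=9000, c_supply=2000, c_breathing=8000))  # ~1.17
print(indoor_outdoor_ratio(c_indoor=8000, c_outdoor=12000))                    # ~0.67
```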
Abstract:
Housing affordability is gaining increasing prominence in the Australian socioeconomic landscape, despite strong economic growth and prosperity, and it is a major consideration for any new development. However, it is multi-dimensional, complex and interwoven. One factor widely held to impact housing affordability is holding costs. Although holding costs are only one contributor, the nature and extent of their impact requires clarification; they are certainly more multifarious than a simple calculation of the interest or opportunity cost of land holding. For example, preliminary analysis suggests that even small shifts in the regulatory assessment period can significantly affect housing affordability. Other costs associated with "holding" also impact housing affordability, although these costs cannot always be easily identified. Nevertheless, it can be said that ultimately the real impact is felt by those who can least afford it: new home buyers, who can relatively easily be pushed into the realms of un-affordability.
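To illustrate the sensitivity to the assessment period, a back-of-the-envelope holding-cost calculation that compounds a finance rate over the months the land is held; the land value, rate and periods are invented figures, not data from the paper.

```python
# Back-of-the-envelope holding cost of land during the regulatory assessment
# period, compounded monthly. The land value, finance rate and periods are
# invented figures used only to show the sensitivity, not data from the paper.
def holding_cost(land_value: float, annual_rate: float, months: int) -> float:
    return land_value * ((1 + annual_rate / 12) ** months - 1)

land_value, rate = 300_000.0, 0.07
for months in (9, 12, 18):
    cost = holding_cost(land_value, rate, months)
    print(f"{months:2d} months of holding: ${cost:,.0f}")
```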
Abstract:
This exhibition was the outcome of a personal arts-based exploration of the meaning of interiority. Through the process it was found that existentially the architectural wall differentiating inside from outside does not exist but operates as a space of overlap, a groundless ground providing for dwelling in the real existential sense of the word.
Abstract:
This paper explores how we may design located information and communication technologies (ICTs) to foster community sentiment. It focuses explicitly on possibilities for ICTs to create new modalities of place by exploring key factors such as shared experiences, shared knowledge and shared authorship. To contextualise this discussion in a real-world setting, the paper presents FIGMENTUM, a situated generative art application that was developed for, and installed in, a new urban development. FIGMENTUM is a non-authoritative, non-service-based application that aims to trigger emotional and representational place-based communities. Out of this practice-led research come a theory and a process for designing creative place-based ICTs to animate our urban communities.
Abstract:
There is a growing literature (Arthur, Inkson, & Pringle, 1999; Collin & Young, 2000; Hall & Associates, 1996; Peiperl, Arthur, & Anand, 2002) about the changing workplace and the consequent changes to our understanding of the place of career in individuals' lives (Richardson, 1996, 2000): "Careers are becoming more varied and more difficult to manage for both individuals and organisations" (Arnold et al., 2005, p. 523). This chapter will present the background to the changes in the world of work and the changes which inevitably impact individuals' careers. It will then focus on the close relationship between career development and lifelong learning, and on the importance of ongoing professional learning for individuals to maintain employability in a changing work world.