263 results for Applied general equilibrium


Relevance:

20.00%

Publisher:

Abstract:

With the emergence of multi-core processors into the mainstream, parallel programming is no longer the specialized domain it once was. There is a growing need for systems to allow programmers to more easily reason about data dependencies and inherent parallelism in general purpose programs. Many of these programs are written in popular imperative programming languages like Java and C#. In this thesis I present a system for reasoning about side-effects of evaluation in an abstract and composable manner that is suitable for use by both programmers and automated tools such as compilers. The goal of developing such a system is both to facilitate the automatic exploitation of the inherent parallelism present in imperative programs and to allow programmers to reason about dependencies which may be limiting the parallelism available for exploitation in their applications. Previous work on languages and type systems for parallel computing has tended to focus on providing the programmer with tools to facilitate the manual parallelization of programs; programmers must decide when and where it is safe to employ parallelism without the assistance of the compiler or other automated tools. None of the existing systems combine abstraction and composition with parallelization and correctness checking to produce a framework which helps both programmers and automated tools to reason about inherent parallelism. In this work I present a system for abstractly reasoning about side-effects and data dependencies in modern, imperative, object-oriented languages using a type and effect system based on ideas from Ownership Types. I have developed sufficient conditions for the safe, automated detection and exploitation of a number of task, data and loop parallelism patterns in terms of ownership relationships. To validate my work, I have applied my ideas to the C# version 3.0 language to produce a language extension called Zal. I have implemented a compiler for the Zal language as an extension of the GPC# research compiler as a proof of concept of my system, and I have used it to parallelize a number of real-world applications to demonstrate the feasibility of my proposed approach. In addition to this empirical validation, I present an argument for the correctness of the type system and language semantics I have proposed, as well as sketches of proofs for the correctness of the proposed sufficient conditions for parallelization.
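The central idea, that loop iterations whose effects touch only disjoint owned data are safe to run in parallel, can be sketched independently of any particular type system. Below is a minimal Python illustration of that reasoning (a hypothetical example, not Zal or GPC# code): each task reads and writes only the row it owns, which is exactly the kind of disjointness an ownership-based effect system would verify statically before parallelizing the loop.

```python
from concurrent.futures import ProcessPoolExecutor

def smooth_row(row):
    """Pure per-row effect: reads only `row`, writes only its own result."""
    out = []
    for i in range(len(row)):
        window = row[max(i - 1, 0):i + 2]
        out.append(sum(window) / len(window))
    return out

if __name__ == "__main__":
    image = [[float(r * 10 + c) for c in range(8)] for r in range(4)]

    # Each iteration's side-effects are confined to data it "owns" (one row),
    # so the iterations are mutually independent and the loop may be run in
    # parallel without changing the program's observable behaviour.
    with ProcessPoolExecutor() as pool:
        smoothed = list(pool.map(smooth_row, image))

    print(smoothed[0])
```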

Relevance:

20.00%

Publisher:

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance, and capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models offer little insight into the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at those locations, and to have their outputs validated with data acquired at the same sites, so that the outputs would be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used, and the models needed to be adaptable to variable operating conditions so that they might be applied, where possible, to other similar systems and facilities. It was not possible in this single study to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions. Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models under a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations owing to variability in road rules and driving cultures. Not all observed manoeuvres were modelled; some unusual manoeuvres were considered unwarranted to model. The models developed nevertheless contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic, and theory was established to account for this behaviour. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flows are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections.
Constant departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated, and critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995); the minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows, and pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be smaller when unsignalised rather than signalised intersections are located upstream of on-ramps, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model, and merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models in assessing performance, and to provide further insight into the nature of operations.
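For readers unfamiliar with the machinery, the following Python sketch shows how Cowan's M3 headway model combines with gap acceptance to yield a merge capacity estimate. All parameter values are illustrative assumptions rather than the study's calibrations; only the 1 s minimum headway and a follow-on time in the calibrated 1 to 1.2 s range are taken from the abstract, and the critical gap is chosen inside the bounds stated above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Cowan M3 headway model: a proportion `alpha` of major-stream vehicles are
# "free" with shifted-exponential headways; the rest travel bunched at the
# minimum headway `delta`. Parameters below are assumed, not calibrated.
q = 1200 / 3600.0                      # major (kerb lane) flow, veh/s
delta = 1.0                            # minimum headway, s (from the abstract)
alpha = 0.6                            # proportion of free vehicles (assumed)
lam = alpha * q / (1.0 - delta * q)    # decay rate so the mean headway is 1/q

n = 100_000
free = rng.random(n) < alpha
headways = np.where(free, delta + rng.exponential(1.0 / lam, n), delta)

# Gap acceptance: a merging driver needs at least the critical gap t_c, and
# each following driver needs a further follow-on time t_f.
t_c = 2.0   # critical gap, s (assumed; between t_f and t_f + 1 s as above)
t_f = 1.1   # follow-on time, s (within the calibrated 1-1.2 s range)

merges_per_gap = np.clip(np.floor((headways - t_c) / t_f) + 1, 0, None)
capacity = merges_per_gap.sum() / headways.sum()   # ramp vehicles absorbed/s

print(f"mean headway     : {headways.mean():.2f} s (target {1 / q:.2f} s)")
print(f"P(headway > 1 s) : {(headways > delta).mean():.2f}")
print(f"merge capacity   : {capacity * 3600:.0f} veh/h")
```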

Relevance:

20.00%

Publisher:

Abstract:

Continuous passive motion (CPM) is currently a part of patient rehabilitation regimens after a variety of orthopedic surgical procedures. While CPM can enhance the joint healing process, the direct effects of CPM on cartilage metabolism remain unknown. Recent in vivo and in vitro observations suggest that mechanical stimuli can regulate articular cartilage metabolism of proteoglycan 4 (PRG4), a putative lubricating and chondroprotective molecule found in synovial fluid and at the articular cartilage surface.

Objectives: (1) Determine the topographical variation in intrinsic cartilage PRG4 secretion. (2) Apply a CPM device to whole joints in bioreactors and assess the effects of CPM on PRG4 biosynthesis.

Methods: A bioreactor was developed to apply CPM to bovine stifle joints in vitro. The effects of 24 h of CPM on PRG4 biosynthesis were determined.

Results: PRG4 secretion rate varied markedly over the joint surface. Rehabilitative joint motion applied in the form of CPM regulated PRG4 biosynthesis in a manner dependent on the duty cycle of cartilage sliding against opposing tissues. Specifically, in certain regions of the femoral condyle that were continuously or intermittently sliding against meniscus and tibial cartilage during CPM, chondrocyte PRG4 synthesis was higher with CPM than without.

Conclusions: Rehabilitative joint motion, applied in the form of CPM, stimulates chondrocyte PRG4 metabolism. The stimulation of PRG4 synthesis is one mechanism by which CPM may benefit cartilage and joint health in post-operative rehabilitation. © 2006 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Interaction Design is a fast-developing branch of Industrial Design. The availability of cheap microprocessors and sensor electronics allows interactions between people and products that were until recently impossible, adding further layers of complexity to the design process. Novice designers find it difficult to juggle these complexities effectively and typically tend to focus on one aspect at a time. They also tend to take a linear, step-by-step approach to the design process, in contrast to expert designers, who pursue “parallel lines of thought” whilst simultaneously co-evolving both problem and solution (Lawson, 1993). This paper explores an approach that encourages designers (in this case novice designers) to take a parallel rather than linear approach to the design process. It also addresses the problem of social loafing that tends to occur in team activities.

Relevance:

20.00%

Publisher:

Abstract:

The interaction of 10-hydroxycamptothecine (HCPT) with DNA under pseudo-physiological conditions (Tris-HCl buffer of pH 7.4), using ethidium bromide (EB) dye as a probe, was investigated with the use of spectrofluorimetry, UV-vis spectrometry and viscosity measurements. The binding constant and binding number for HCPT with DNA were evaluated as (7.1 ± 0.5) × 10⁴ M⁻¹ and 1.1, respectively, by multivariate curve resolution-alternating least squares (MCR-ALS). Moreover, parallel factor analysis (PARAFAC) was applied to resolve the three-way fluorescence data obtained from the interaction system, and the concentration information for the three components of the system at equilibrium was simultaneously obtained. It was found that there was a cooperative interaction between the HCPT-DNA complex and EB, which produced a ternary complex of HCPT-DNA-EB. © 2011 Elsevier B.V.
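As a point of reference for the magnitude of the reported binding constant, the sketch below fits a simple 1:1 binding isotherm to synthetic fluorescence titration data with SciPy. It is a toy illustration only: the paper's actual resolution used MCR-ALS and PARAFAC on multiway spectral data, and every concentration and signal value here is invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# 1:1 binding isotherm: fraction bound f = K*L / (1 + K*L). Free ligand is
# approximated by total ligand (valid when depletion is small), and the
# fluorescence signal interpolates between unbound (F0) and bound (Finf).
def isotherm(L, K, F0, Finf):
    frac = K * L / (1.0 + K * L)
    return F0 + (Finf - F0) * frac

# Synthetic titration data (hypothetical, for illustration only).
K_true = 7.1e4                        # M^-1, the order reported above
L = np.linspace(0.0, 1e-4, 20)        # ligand concentrations, M
rng = np.random.default_rng(1)
F = isotherm(L, K_true, 100.0, 20.0) + rng.normal(0.0, 0.5, L.size)

popt, pcov = curve_fit(isotherm, L, F, p0=[1e4, 90.0, 30.0])
K_fit, K_err = popt[0], np.sqrt(pcov[0, 0])
print(f"K = ({K_fit:.2e} +/- {K_err:.1e}) M^-1")
```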

Relevance:

20.00%

Publisher:

Abstract:

Background: Concern about skin cancer is a common reason for people from predominantly fair-skinned populations to present to primary care doctors.

Objectives: To examine the frequency and body-site distribution of malignant, pre-malignant and benign pigmented skin lesions excised in primary care.

Methods: This prospective study, conducted in Queensland, Australia, included 154 primary care doctors. For all excised or biopsied lesions, doctors recorded the patient's age and sex, body site, level of patient pressure to excise, and the clinical diagnosis. Histological confirmation was obtained through pathology laboratories.

Results: Of 9650 skin lesions, 57·7% were excised in males and 75·0% were excised in patients ≥50 years. The most common diagnoses were basal cell carcinoma (BCC) (35·1%) and squamous cell carcinoma (SCC) (19·7%). Compared with the whole body, the highest densities of SCC, BCC and actinic keratoses were observed on chronically sun-exposed areas of the body, including the face in males and females, the scalp and ears in males, and the hands in females. The density of BCC was also high on intermittently or rarely exposed body sites. Females, younger patients and patients with melanocytic naevi were significantly more likely to exert moderate/high levels of pressure on the doctor to excise.

Conclusions: More than half of the excised lesions were skin cancer, which mostly occurred on the more chronically sun-exposed areas of the body. Information on the type and body-site distribution of skin lesions can aid in the diagnosis and planned management of skin cancer and other skin lesions commonly presented in primary care.

Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION. Following anterior thoracoscopic instrumentation and fusion for the treatment of thoracic AIS, implant-related complication rates as high as 20.8% have been reported. Currently, the magnitudes of the forces applied to the spine during anterior scoliosis surgery are unknown. The aim of this study was to measure the segmental compressive forces applied during anterior single rod instrumentation in a series of adolescent idiopathic scoliosis patients. METHODS. A force transducer was designed, constructed and retrofitted to a surgical cable compression tool routinely used to apply segmental compression during anterior scoliosis correction. Transducer output was continuously logged during the compression of each spinal joint, and the output at completion was converted to an applied compression force using calibration data. The angle between adjacent vertebral body screws was also measured on intra-operative frontal plane fluoroscope images taken both before and after each joint compression; the difference in angle between the two images was calculated as an estimate of the correction achieved at each spinal joint. RESULTS. Force measurements were obtained for 15 scoliosis patients (aged 11-19 years) with single thoracic curves (Cobb angles 47°-67°). In total, 95 spinal joints were instrumented. The average force applied at a single joint was 540 N (± 229 N), ranging between 88 N and 1018 N. The experimental error in the force measurement, determined from transducer calibration, was ± 43 N. A trend towards higher forces at joints close to the apex of the scoliosis was observed. The average joint correction angle measured by fluoroscope imaging was 4.8° (± 2.6°, range 0°-12.6°). CONCLUSION. This study has quantified, in vivo, the intra-operative correction forces applied by the surgeon during anterior single rod instrumentation. These data provide a useful contribution towards an improved understanding of the biomechanics of scoliosis correction; in particular, they will be used as input for developing patient-specific finite element simulations of scoliosis correction surgery.
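The conversion from logged transducer output to applied force is a routine linear calibration. The Python sketch below shows that step with hypothetical calibration loads and outputs; the study's actual calibration data are not given in the abstract.

```python
import numpy as np

# Hypothetical calibration: known applied loads vs. logged transducer output.
cal_force = np.array([0, 200, 400, 600, 800, 1000])            # N
cal_output = np.array([0.02, 1.01, 2.03, 2.98, 4.05, 5.01])    # V

# Least-squares linear calibration: force = a * output + b.
a, b = np.polyfit(cal_output, cal_force, 1)

# Convert the logged output at completion for each joint to a force.
joint_output = np.array([2.1, 3.4, 4.9, 2.7, 1.5])             # V, hypothetical
joint_force = a * joint_output + b

print(f"calibration : force = {a:.1f} * V + {b:.1f} N")
print(f"mean +/- SD : {joint_force.mean():.0f} +/- "
      f"{joint_force.std(ddof=1):.0f} N")
```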

Relevance:

20.00%

Publisher:

Abstract:

Although Australia is the world's driest continent, without the complication of international borders and with a generally good governance reputation, its record of water governance is very poor. This chapter considers some of the potentially general lessons that might be derived for water governance. These include: the difficulties of delineating water rights; the apparent preference for creating property rights in unsustainable uses of water while failing to deliver basic water rights; the intertwining of carbon and water crises; the dangers of privatising networks that form natural monopolies; and the dangers of disciplinary hubris where interdisciplinary understanding is critical. It concludes by starting to address some of the water governance issues raised by globalisation.

Relevance:

20.00%

Publisher:

Abstract:

Nationally, there is much legislation regulating land sale transactions, particularly in relation to seller disclosure of information. The statutes require strict compliance by a seller, failing which, in general, a buyer can terminate the contract. In a number of instances where buyers have sought to exercise these rights, sellers have alleged that the buyers had, either expressly or by conduct, waived their rights to rely upon these statutes. This article examines the nature of these rights in this context, whether they are capable of waiver and, if so, what words or conduct might be sufficient to amount to waiver. The analysis finds that the law is in a very unsatisfactory state and that those rules that can be identified as having relevance are unevenly applied, and concludes that sellers have, in the main, been unsuccessful in defeating buyers' statutory rights as a result of an alleged waiver by those buyers.

Relevance:

20.00%

Publisher:

Abstract:

Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and such signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering which occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted on traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering and control engineering, among various other applications.

This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT and RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, discrete-time Fourier, and discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
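As a taste of the Part I material, the following minimal NumPy sketch (not taken from the book) verifies the convolution theorem that underlies the treatment of convolution, transforms and filters: filtering a signal with a moving-average FIR filter in the time domain matches multiplying DFTs in the frequency domain.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])      # input signal
h = np.array([0.25, 0.25, 0.25, 0.25])  # 4-tap moving-average FIR filter

# Time domain: direct linear convolution.
y_time = np.convolve(x, h)

# Frequency domain: multiply DFTs, zero-padded to the full output length,
# then invert. The convolution theorem says the two results coincide.
N = len(x) + len(h) - 1
y_freq = np.fft.ifft(np.fft.fft(x, N) * np.fft.fft(h, N)).real

print(np.allclose(y_time, y_freq))  # True
```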