846 results for Flexible Design Framework for Airport (FlexDFA)
Abstract:
Common goals in epidemiologic studies of infectious diseases include identification of the infectious agent, description of the modes of transmission and characterization of factors that influence the probability of transmission from infected to uninfected individuals. In the case of AIDS, the agent has been identified as the Human Immunodeficiency Virus (HIV), and transmission is known to occur through a variety of contact mechanisms including unprotected sexual intercourse, transfusion of infected blood products and sharing of needles in intravenous drug use. Relatively little is known about the probability of HIV transmission associated with the various modes of contact, or the role that other cofactors play in promoting or suppressing transmission. Here, transmission probability refers to the probability that the virus is transmitted to a susceptible individual following exposure consisting of a series of potentially infectious contacts. The infectivity of HIV for a given route of transmission is defined to be the per-contact probability of infection. Knowledge of infectivity and its relationship to other factors is important in understanding the dynamics of the AIDS epidemic and in suggesting appropriate measures to control its spread. The primary source of empirical data about infectivity comes from sexual partners of infected individuals. Partner studies consist of a series of such partnerships, usually heterosexual and monogamous, each composed of an initially infected "index case" and a partner who may or may not be infected by the time of data collection. However, because the infection times of both partners may be unknown and the history of contacts uncertain, any quantitative characterization of infectivity is extremely difficult. Thus, most statistical analyses of partner study data involve the simplifying assumption that infectivity is a constant common to all partnerships. The major objectives of this work are to describe and discuss the design and analysis of partner studies, providing a general statistical framework for investigations of infectivity and risk factors for HIV transmission. The development is largely based on three papers: Jewell and Shiboski (1990), Kim and Lagakos (1990), and Shiboski and Jewell (1992).
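As a worked illustration of the constant-infectivity simplification mentioned above (the standard binomial model in this literature, stated here for orientation rather than quoted from the three papers): if each contact independently transmits the virus with per-contact infectivity \(\lambda\), the transmission probability after \(k\) contacts is

\[ P(\text{infection} \mid k \text{ contacts}) = 1 - (1 - \lambda)^{k}. \]

Estimating \(\lambda\) from partner study data is difficult precisely because the number of contacts \(k\) and the infection times are often unknown or uncertain.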
Abstract:
Currently, photon Monte Carlo treatment planning (MCTP) for a patient stored in the patient database of a treatment planning system (TPS) can usually only be performed using a cumbersome multi-step procedure requiring many user interactions. Automation is therefore needed for use in clinical routine. In addition, because of the long computing times in MCTP, optimization of the MC calculations is essential. For these purposes a new graphical user interface (GUI)-based photon MC environment has been developed, resulting in a very flexible framework in which appropriate MC transport methods are assigned to different geometric regions while still benefiting from the features included in the TPS. In order to provide a flexible MC environment, the MC particle transport has been divided into different parts: the source, the beam modifiers and the patient. The source part includes the phase-space source, source models and full MC transport through the treatment head. The beam modifier part consists of one module for each beam modifier. To simulate the radiation transport through each individual beam modifier, one of three full MC transport codes can be selected independently. Additionally, for each beam modifier a simple or an exact geometry can be chosen. Thereby, different complexity levels of radiation transport are applied during the simulation. For the patient dose calculation, two different MC codes are available. A special plug-in in Eclipse, providing all necessary information by means of DICOM streams, is used to start the developed MC GUI. The implementation of this framework separates the MC transport from the geometry, and the modules pass the particles in memory; hence, no files are used as the interface. The implementation is realized for 6 and 15 MV beams of a Varian Clinac 2300 C/D. Several applications demonstrate the usefulness of the framework. Apart from applications dealing with the beam modifiers, two patient cases are shown, in which MC-calculated dose distributions are compared with those calculated by a pencil beam algorithm or the AAA algorithm. Interfacing this flexible and efficient MC environment with Eclipse allows widespread use for all kinds of investigations, from timing and benchmarking studies to clinical patient studies. Additionally, it is possible to add modules, keeping the system highly flexible and efficient.
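A minimal sketch of the in-memory module chain described above - source, beam modifiers, patient - with modules handing particles over in memory rather than through files. All class and method names are illustrative assumptions, not the framework's actual API:

```python
# Hypothetical sketch of the modular transport chain: source -> beam
# modifiers -> patient. Names and interfaces are illustrative only.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Particle:
    energy_mev: float   # kinetic energy
    position: tuple     # (x, y, z)
    direction: tuple    # unit direction vector

class TransportModule:
    """Common interface: take particles in, return transported particles."""
    def transport(self, particles: Iterable[Particle]) -> List[Particle]:
        raise NotImplementedError

class PhaseSpaceSource(TransportModule):
    def transport(self, particles):
        # Stand-in for a phase-space source, a source model, or full MC
        # transport through the treatment head.
        return [Particle(6.0, (0.0, 0.0, -100.0), (0.0, 0.0, 1.0))]

class BeamModifierModule(TransportModule):
    def __init__(self, mc_code: str, exact_geometry: bool):
        # Per the description: each modifier independently selects one of
        # the available MC codes and a simple or exact geometry.
        self.mc_code = mc_code
        self.exact_geometry = exact_geometry
    def transport(self, particles):
        return list(particles)  # placeholder: particles pass through unchanged

def run_chain(modules: List[TransportModule]) -> List[Particle]:
    particles: List[Particle] = []
    for module in modules:  # particles handed over in memory, no files
        particles = module.transport(particles)
    return particles

beam = run_chain([PhaseSpaceSource(), BeamModifierModule("code_a", exact_geometry=True)])
```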
Abstract:
The goal of this research is to provide a framework for the vibro-acoustic analysis and design of multiple-layer constrained damping structures. The existing research on damping and the viscoelastic damping mechanism falls into four mainstream approaches: modeling techniques for damping treatments/materials; control through the electro-mechanical effect using a piezoelectric layer; optimization by adjusting the parameters of the structure to meet design requirements; and identification of the damping material's properties through the response of the structure. This research proposes a systematic design methodology for the multiple-layer constrained damping beam with consideration of vibro-acoustics. A modeling technique for studying the vibro-acoustics of multiple-layered viscoelastic laminated beams using the Biot damping model is presented using a hybrid numerical model. The boundary element method (BEM) is used to model the acoustical cavity, whereas the finite element method (FEM) is the basis for vibration analysis of the multiple-layered beam structure. Through the proposed procedure, the analysis can easily be extended to other complex geometries with arbitrary boundary conditions. The frequency-dependent behavior of viscoelastic damping materials is represented by the Biot damping model, which takes into account the effects of frequency, temperature and the use of different damping materials in individual layers. A curve-fitting procedure used to obtain the Biot constants for different damping materials at each temperature is explained. The results from structural vibration analysis for selected beams agree with published closed-form results, and the radiated noise for a sample beam structure obtained using commercial BEM software is compared with the acoustical results of the same beam using the Biot damping model. The extension of the Biot damping model to the multiple-degree-of-freedom (MDOF) dynamics equations of a discrete system is demonstrated in order to introduce different types of viscoelastic damping materials. The mechanical properties of viscoelastic damping materials, such as shear modulus and loss factor, change with ambient temperature and frequency. The application of multiple-layer treatment increases the damping of the structure significantly and thus helps to attenuate vibration and noise over a broad range of frequencies and temperatures. The main contributions of this dissertation include the following three major tasks: 1) study of the viscoelastic damping mechanism and the dynamics equation of a multilayer damped system incorporating the Biot damping model; 2) building the finite element model of the multiple-layer constrained viscoelastic damping beam and conducting the vibration analysis; and 3) extending the vibration problem to the BEM-based acoustical problem and comparing the results with commercial simulation software.
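For orientation, the Biot damping model referred to above is commonly written in the Laplace domain as a series of relaxation terms (notation assumed here; the dissertation's exact form may differ):

\[ G(s) = G_{0}\left(1 + \sum_{k=1}^{n} a_{k}\,\frac{s}{s + b_{k}}\right), \]

where \(G_{0}\) is the static shear modulus and the Biot constants \(a_{k}, b_{k}\) are what the curve-fitting procedure extracts from measured shear modulus and loss factor data, repeated for each damping material and each temperature.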
Abstract:
The objective of this research was to develop a high-fidelity dynamic model of a parafoil-payload system with respect to its application in the Ship Launched Aerial Delivery System (SLADS). SLADS is a concept in which cargo can be transferred from ship to shore using a parafoil-payload system. It is accomplished in two phases: an initial towing phase, when the glider follows the towing vessel in a passive lift mode, and an autonomous gliding phase, when the system is guided to the desired point. While many previous researchers have analyzed the parafoil-payload system released from another airborne vehicle, limited work has been done on towing the system up from ground or sea. One of the main contributions of this research is the development of a nonlinear dynamic model of a towed parafoil-payload system. After an extensive literature review of existing methods of modeling a parafoil-payload system, a five degree-of-freedom model was developed. The inertial and geometric properties of the system were investigated to predict accurate results in the simulation environment. Since extensive research has been done on the aerodynamic characteristics of paragliders, an existing aerodynamic model was chosen to incorporate the effects of air flow around the flexible paraglider wing. During the towing phase, it is essential that the parafoil-payload system follow the line of the towing vessel's path to prevent an unstable flight condition called 'lockout'. A detailed study of the causes of lockout, its mathematical representation, and the flight conditions and parameters related to lockout constitutes another contribution of this work. A linearized model of the parafoil-payload system was developed and used to analyze the stability of the system about equilibrium conditions, and the relationship between the control surface inputs and stability was investigated. In addition to stability of flight, another important objective of SLADS is to tow up the parafoil-payload system as fast as possible. The tension in the tow cable is directly proportional to the rate of ascent of the parafoil-payload system, and lockout instability is more likely when tow tensions are large. Thus there is a tradeoff between susceptibility to lockout and rapid deployment. Control strategies were also developed for optimal tow-up and to maintain stability in the event of disturbances.
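As a reminder of the standard machinery implied by the linearized analysis above (generic state-space form, assumed rather than taken from the thesis): small perturbations \(x\) about a towing equilibrium evolve as

\[ \dot{x} = A\,x + B\,u, \]

and the equilibrium is locally stable when every eigenvalue of \(A\) has negative real part; the input matrix \(B\) captures how the control surface inputs \(u\) shift that behavior, which is the relationship the stability study examines.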
Abstract:
Students are now involved in a vastly different textual landscape than many English scholars, one that relies on the "reading" and interpretation of multiple channels of simultaneous information. As a response to these new kinds of literate practices, my dissertation adds to the growing body of research on multimodal literacies, narratology in new media, and rhetoric through an examination of the place of video games in English teaching and research. I describe in this dissertation a hybridized theoretical basis for incorporating video games in English classrooms. This framework for textual analysis includes elements from narrative theory in literary study, rhetorical theory, and literacy theory; when these are combined to account for the multiple modalities and complexities of gaming, they can provide new insights about those theories and practices across all kinds of media, whether written texts, films, or video games. In creating this framework, I hope to encourage students to view texts from a meta-level perspective, encompassing textual construction, use, and interpretation. In order to foster meta-level learning in an English course, I use specific theoretical frameworks from the fields of literary studies, narratology, film theory, aural theory, reader-response criticism, game studies, and multiliteracies theory to analyze a particular video game: World of Goo. These theoretical frameworks inform pedagogical practices used in the classroom for textual analysis across multiple media. Examining a video game from these perspectives, I use analytical methods from each, including close reading, explication, textual analysis, and individual elements of multiliteracies theory and pedagogy. In undertaking an in-depth analysis of World of Goo, I demonstrate the possibilities for classroom instruction with a complex blend of theories and pedagogies in English courses. This blend of theories and practices is meant to foster literacy learning across media, helping students develop metaknowledge of their own literate practices in multiple modes. Finally, I outline a design for a multiliteracies course that would allow English scholars to use video games alongside other texts to interrogate texts as systems of information. In doing so, students can learn to view and transform systems in their own lives as audiences, citizens, and workers.
Abstract:
Hybrid MIMO Phased-Array Radar (HMPAR) is an emerging technology that combines MIMO (multiple-input, multiple-output) radar technology with phased-array radar technology. The technology is in its infancy, but much of the theoretical work for this specific project has already been completed and is explored in great depth in [1]. A brief overview of phased-array radar systems, MIMO radar systems, and the HMPAR paradigm is given in this paper. This report is the culmination of an effort to support research in MIMO and HMPAR utilizing a concept called intrapulse beamscan, by which arbitrary spatial coverage can be achieved within one MIMO beam pulse. The report therefore focuses on designing waveforms for MIMO radar systems with arbitrary spatial coverage using this phenomenon. With intrapulse beamscan, scanning is done through phase-modulated signal design within one pulse rather than through phase shifters in the phased array over multiple pulses. In addition, continuous phase modulation (CPM) signals are considered for their desirable peak-to-average ratio as well as their low spectral leakage. These MIMO waveforms are designed with three goals in mind. The first goal is to achieve flexible spatial coverage while utilizing intrapulse beamscan: as with almost any radar system, we wish to have flexibility in where we send our signal energy. The second goal is to maintain a peak-to-average ratio close to 1 on the envelope of these waveforms, ensuring a signal that is close to constant modulus; a radar system should transmit at the highest available power, since not doing so would further diminish the already very small return signals. The third goal is to ensure low spectral leakage, using various techniques to limit the bandwidth of the designed signals; spectral containment is important to avoid interference with systems that utilize nearby frequencies in the electromagnetic spectrum. These three goals are realized while allowing for the limitations of real radar systems. In addition to flexible spatial coverage, the report examines the spectral properties of various space-filling techniques for desired spatial areas, including Hilbert/Peano curves and standard raster scans.
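For reference, a textbook continuous phase modulation waveform (a general form, assumed here since the report's exact parameterization is not given in the abstract) is

\[ s(t) = \sqrt{\tfrac{2E}{T}}\,\cos\!\Big(2\pi f_{c} t + 2\pi h \sum_{i} \alpha_{i}\, q(t - iT)\Big), \qquad q(t) = \int_{0}^{t} g(\tau)\,d\tau, \]

where the \(\alpha_{i}\) are data symbols, \(h\) is the modulation index and \(g(t)\) is the phase pulse. Because only the phase varies, the envelope is constant, so the peak-to-average ratio equals 1, and a smooth phase pulse keeps spectral leakage low.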
Abstract:
During a project, managers encounter numerous contingencies and face the challenging task of making decisions that will effectively keep the project on track. This task is very challenging because construction projects are non-prototypical and their processes are irreversible. It is therefore critical to apply a methodological approach during the planning phase to develop a few alternative management decision strategies that can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions to the as-planned schedule. Such a methodology should have the following features, which are missing from the existing research: (1) examining the effects of local decisions on global project outcomes; (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made; (3) establishing a method to assess and improve management decision strategies; and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot easily be applied to projects with different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies for managing a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology at the planning stage to develop and identify schedules accompanied by suitable decision strategies for managing a project. The developed methodology also lays the foundation for an algorithm that continuously and automatically generates satisfactory schedules and strategies throughout the construction life of a project. Unlike studies of isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies such as labor, material, equipment, and space policies. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is an iterative method to test and improve decision strategies and schedules, based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the previously developed Interactive Construction Decision Making Aid (ICDMA). ICDMA has an emulator that duplicates the construction process and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
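A minimal sketch of the iterative assess-and-improve loop described above. The disruption model, the policies, and the scoring are illustrative assumptions, not ICDMA's actual interface:

```python
# Sketch: evaluate candidate decision strategies against many emulated
# projects with random disruptions, then keep the better strategy.
import random

def simulate(strategy, n_days=200, disruption_rate=0.1, seed=0):
    """Emulate one project run: each day a disruption may occur and the
    decision strategy maps the current project state to a decision."""
    rng = random.Random(seed)
    delay_days = 0
    for day in range(n_days):
        if rng.random() < disruption_rate:       # random disruptive event
            state = {"day": day, "delay": delay_days}
            decision = strategy(state)           # e.g. a resource-allocation policy
            delay_days += 0 if decision == "reallocate" else 1
    return delay_days

def assess(strategy, n_runs=100):
    """Average project delay over many random event sequences."""
    return sum(simulate(strategy, seed=s) for s in range(n_runs)) / n_runs

# Two candidate decision strategies (policy-driven decision sequences):
aggressive = lambda state: "reallocate"   # always re-plan resources
passive = lambda state: "wait"            # absorb the disruption

# Pick the strategy with the better assessed outcome, then iterate.
best = min([aggressive, passive], key=assess)
```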
Abstract:
Neuromorphic computing has become an emerging field with a wide range of applications. Its challenge lies in developing a brain-inspired architecture that can emulate the human brain and work in real-time applications. In this report a flexible neural architecture is presented, consisting of a 128 x 128 SRAM crossbar memory and 128 spiking neurons. A digital integrate-and-fire model is used for the neurons. All components are designed in a 45 nm technology node. The core can be configured for certain neuron parameters, axon types and synapse states, and is fully digitally implemented. Learning for this architecture is done offline. To train the circuit, the well-known Restricted Boltzmann Machine (RBM) algorithm is used, and linear classifiers are trained at the output of the RBM. Finally, the circuit was tested on a handwritten digit recognition application. Future prospects for this architecture are also discussed.
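An illustrative software model of the digital integrate-and-fire update behind such an architecture (parameter values and update details are assumptions; this is a sketch, not the 45 nm circuit itself):

```python
# Toy model: 128 axons drive 128 neurons through a binary SRAM crossbar.
import numpy as np

N_AXONS, N_NEURONS = 128, 128
rng = np.random.default_rng(0)
crossbar = rng.integers(0, 2, size=(N_AXONS, N_NEURONS))  # SRAM synapse states (0/1)
weights = rng.integers(1, 4, size=N_NEURONS)              # configurable per-neuron weights
potential = np.zeros(N_NEURONS, dtype=np.int64)           # membrane potentials
THRESHOLD, LEAK = 64, 1                                   # configurable neuron parameters

def step(axon_spikes):
    """One tick: integrate crossbar input, apply leak, fire, and reset."""
    global potential
    potential = potential + axon_spikes @ crossbar * weights  # integrate
    potential = np.maximum(potential - LEAK, 0)               # leak
    fired = potential >= THRESHOLD                            # fire
    potential[fired] = 0                                      # reset
    return fired

out_spikes = step(rng.integers(0, 2, size=N_AXONS))
```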
Abstract:
This paper describes the open source framework MARVIN for rapid application development in the field of biomedical and clinical research. MARVIN applications consist of modules that can be plugged together in order to provide the functionality required for a specific experimental scenario. Application modules work on a common patient database that is used to store and organize medical data as well as derived data. MARVIN provides a flexible input/output system with support for many file formats including DICOM, various 2D image formats and surface mesh data. Furthermore, it implements an advanced visualization system and interfaces to a wide range of 3D tracking hardware. Since it uses only highly portable libraries, MARVIN applications run on Unix/Linux, Mac OS X and Microsoft Windows.
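A generic sketch of the plug-together module pattern described above; the class and method names are invented for illustration and are not MARVIN's actual API:

```python
# Modules plugged into an application, all working on a common patient
# database, as in the framework description.
class PatientDatabase:
    """Common store for medical data and data derived from it."""
    def __init__(self):
        self.records = {}

class Module:
    """One unit of functionality that can be plugged into an application."""
    def attach(self, db):
        self.db = db
    def run(self):
        raise NotImplementedError

class DicomImport(Module):
    def run(self):
        self.db.records["volume"] = "stub DICOM volume"  # placeholder load

class SurfaceViewer(Module):
    def run(self):
        print("visualizing", self.db.records.get("volume"))

# An "application" is just the set of modules a scenario requires.
db = PatientDatabase()
for module in (DicomImport(), SurfaceViewer()):
    module.attach(db)
    module.run()
```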
Abstract:
The authors describe the design, fabrication, and testing of a passive wireless sensor platform utilizing low-cost commercial surface acoustic wave (SAW) filters and sensors. Polyimide and polyethylene terephthalate sheets are used as substrates to create a flexible sensor tag that can be applied to curved surfaces. A microfabricated antenna is integrated on the substrate to create a compact form factor. The sensor tags are fabricated using 315 MHz SAW filters and photodiodes and tested with the aid of a fiber-coupled tungsten lamp. Microwave energy transmitted from a network analyzer is used to interrogate the sensor tag. Due to an electrical impedance mismatch at the SAW filter and sensor, energy is reflected at the sensor load and reradiated from the integrated antenna. By selecting sensors whose electrical impedance changes with environmental conditions, the sensor state can be inferred through measurement of the reflected energy profile. Testing has shown that a calibrated system utilizing this type of sensor tag can detect distinct light levels wirelessly and passively. The authors also demonstrate simultaneous operation of two light-detecting tags with different center passbands. Ranging tests show that the sensor tags can operate at a distance of at least 3.6 m.
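The reflection mechanism described above follows the standard transmission-line relation (a general formula, not one quoted from the paper): for a sensor load impedance \(Z_{L}\) terminating a feed of characteristic impedance \(Z_{0}\),

\[ \Gamma = \frac{Z_{L} - Z_{0}}{Z_{L} + Z_{0}}, \qquad P_{\text{reflected}} = |\Gamma|^{2}\, P_{\text{incident}}, \]

so a photodiode whose impedance shifts with illumination modulates the energy reradiated from the tag antenna, which is what the network analyzer measurement captures.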
Abstract:
Incorporating physical activity and exertion into pervasive gaming applications can provide health and social benefits. Prior research has produced several prototypes of pervasive games that encourage exertion as a form of interaction; however, no detailed critical account of the various approaches exists. We focus on networked exertion games, detailing some of our work while identifying the remaining issues on the way to a coherent framework. We outline common lessons learned and use them as the basis for generalizations about the design of networked exertion games. We propose possible directions for further investigation, hoping to provide guidance for future work and to facilitate greater awareness and exposure of exertion games and their benefits.
Abstract:
Component commonality - the use of the same version of a component across multiple products - is increasingly considered a promising way to offer high external variety while retaining low internal variety in operations. However, increasing commonality has both positive and negative cost effects, so optimization approaches are required to identify the optimal commonality level. Since components influence, to a greater or lesser extent, nearly every process step along the supply chain, it is not surprising that a multitude of divergent commonality problems has been investigated in the literature, each with a specific algorithm designed for the respective problem. This paper aims at a general framework that is flexible and efficient enough to be applied to a wide range of commonality problems. Such a procedure, based on a two-stage graph approach, is presented and tested. Finally, the flexibility of the procedure is demonstrated by customizing the framework to account for different types of commonality problems.
Abstract:
This paper presents the results of a comprehensive literature review of the organization of purchasing covering the period from 1967 to 2009. The review provides a structured overview of prior research topics and findings and identifies gaps in the existing literature that may be addressed in future research. The intention of the review is to a) synthesize prior research, b) provide researchers with a structural framework on which future research on the organization of purchasing may be oriented, and c) suggest promising areas for future research.
Abstract:
Around 30 universities have formed a development consortium within the ZKI's Web working group (Arbeitskreis Web) around the idea of a generic student app framework that can be adapted to a wide variety of university requirements. The goal is to evaluate a comprehensive inventory of all electronic study services at the participating institutions, to create overarching data and metadata models for describing these services, and to develop interfaces to the common campus management systems as well as to e-learning infrastructures (LMS, print services, electronic catalogs, etc.). In a final step, study management apps for students will be built on top of this middleware, bundling the various data and communication streams of the standardized services and channels and presenting them in a form that is easy for students to understand and navigate. Designing the project as a decentralized development effort distributed across many universities under central project management ensures that redundant development is avoided, that nationally standardized service offerings can be provided, and that knowledge transfer between the many participating universities on the use of mobile devices (smartphones, tablets, and the corresponding apps) is stimulated. This broad community of interest can also strengthen vendor support for implementing clear interface specifications for campus management systems. A further central element of the plan is an option for app users to build a personal e-portfolio that fully respects data privacy law. Details can be found in the Project Goals (Projektziele) chapter below.