918 results for Television -- Antennas -- Design and construction -- Data processing


Relevance: 100.00%

Abstract:

Computer game technology is poised to make a significant impact on the way our youngsters will learn. Our youngsters are ‘Digital Natives’, immersed in digital technologies, especially computer games. They expect to utilize these technologies in learning contexts. This expectation, and our response as educators, may change classroom practice and inform curriculum developments. This chapter approaches these issues ‘head on’. Starting from a review of current educational issues and an evaluation of educational theory and instructional design principles, a new theoretical approach to the construction of “Educational Immersive Environments” (EIEs) is proposed. Elements of this approach are applied to the development of an EIE to support Literacy Education in UK Primary Schools. An evaluation of a trial within a UK Primary School is discussed. Conclusions from both the theoretical development and the evaluation suggest how future teacher-practitioners may embrace both the technology and our approach to develop their own learning resources.

Relevance: 100.00%

Abstract:

This paper describes the development and evaluation of web-based museum trails for university-level design students to access on handheld devices in the Victoria and Albert Museum (V&A) in London. The trails offered students a range of ways of exploring the museum environment and collections, some encouraging students to interpret objects and museum spaces in lateral and imaginative ways, others more straightforwardly providing context and extra information. In a three-stage qualitative evaluation programme, student feedback showed that overall the trails enhanced students’ knowledge of, interest in, and closeness to the objects. However, the trails were only partially successful from a technological standpoint due to device and network problems. Broader findings suggest that technology has a key role to play in helping to maintain the museum as a learning space which complements that of universities as well as schools. This research informed my other work in visitor-constructed learning trails in museums, specifically in the theoretical approach to data analysis used, in the research design, and in informing ways to structure visitor experiences in museums. It resulted in a conference presentation, and more broadly informed my subsequent teaching practice.

Relevance: 100.00%

Abstract:

Planning and objectives for various departments within the Department of Transportation for 1990

Relevance: 100.00%

Abstract:

This Handbook has been prepared by the Iowa DOT as a guide and supplement to the MUTCD. It provides in one document a large number of illustrations which can be easily adapted to specific conditions by field personnel. It is intended to supersede all previous non-conforming standards now being used throughout the state and to provide uniform guidelines for all agencies, public and private, that must conduct construction and maintenance activities on the streets and highways of the state. The illustrations contained herein serve as a quick reference for field personnel to follow; however, no amount of detailed instruction can adequately cover every situation. For this reason, sound judgment is required in applying these illustrations to actual field conditions.

Relevance: 100.00%

Abstract:

This paper carries the rather weighty title of "Evolution of Design Practice at the Iowa State Highway Commission for the Determination of Peak Discharges at Bridges and Culverts." Hopefully, this evolving process will lead to a more precise definition of a peak rate of runoff for a selected recurrence interval at a particular site. In this paper the author will relate where the Highway Commission has been, is now, and will be going in this art of hydrology. He will then offer some examples at a few sites in Iowa to illustrate the use of the various methods. Finally, he will look ahead to some of the pitfalls still lying in wait for us.
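Among the peak-discharge methods long applied by highway agencies to small drainage areas is the Rational Method, Q = CiA. The sketch below is illustrative only; the site values (runoff coefficient, intensity, area) are hypothetical and not taken from the paper.

```python
# Hedged sketch: the Rational Method, one of the peak-discharge methods
# commonly applied to small highway drainage areas. Site values below
# are hypothetical, not drawn from the paper.

def rational_peak_discharge(c, intensity_in_hr, area_acres):
    """Q = C * i * A, with Q in cubic feet per second (cfs).

    The unit-conversion coefficient is ~1.008 and is conventionally
    taken as 1, so C (dimensionless) * i (in/hr) * A (acres) gives cfs.
    """
    return c * intensity_in_hr * area_acres

# Example: a 40-acre rural watershed, C = 0.35, 25-year intensity 2.5 in/hr
q25 = rational_peak_discharge(0.35, 2.5, 40.0)
print(f"25-year peak discharge: {q25:.1f} cfs")
```

The recurrence interval enters through the rainfall intensity i, which is read from an intensity-duration-frequency curve for the chosen return period.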

Relevance: 100.00%

Abstract:

The investigations for this report were initiated in October 1967 to perform the following: 1. Review the current Iowa State Highway Commission roadway geometric design standards and criteria for conformance with national policies and recent research findings, with special attention to highway safety. 2. Review the current Iowa State Highway Commission roadway lighting design standards and criteria for conformance with national policies and recent research findings, with special attention to highway safety.

Relevance: 100.00%

Abstract:

The design number of gyrations (Ndesign) introduced by the Strategic Highway Research Program (SHRP) and used in the Superior Performing Asphalt Pavement (Superpave) mix design method has been commonly used in flexible pavement design throughout the US since 1996. Ndesign, also known as the compaction effort, is used to simulate field compaction during construction; current levels have been reported to leave pavements unable to reach ultimate density within the first 2 to 3 years after construction, potentially having an adverse impact on long-term performance. Other state transportation agencies have conducted studies validating Ndesign for their specific regions, which resulted in modifications of the gyration effort for the various traffic levels. Validating this relationship for Iowa asphalt mix designs will lead to better correlations between mix design target voids, field voids, and performance. A comprehensive analysis investigated current Ndesign levels with existing mixes and pavements and developed initial asphalt mix design recommendations that identify an optimum Ndesign through the use of performance test data.
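The link between Ndesign and target voids rests on the standard air-void computation used in Superpave volumetrics, Va = 100 * (1 - Gmb/Gmm). A minimal sketch, with hypothetical specimen gravities rather than values from the study:

```python
# Hedged sketch of the air-void computation central to Superpave mix
# design: Va = 100 * (1 - Gmb / Gmm). Specimen values are hypothetical.

def air_voids(gmb, gmm):
    """Percent air voids from the bulk (Gmb) and theoretical maximum
    (Gmm) specific gravities of the compacted mix."""
    return 100.0 * (1.0 - gmb / gmm)

# A mix is typically designed to hit about 4% air voids at Ndesign.
va = air_voids(gmb=2.362, gmm=2.460)
print(f"Air voids at Ndesign: {va:.2f}%")
```

Adjusting Ndesign shifts how much compaction a specimen receives, and hence the asphalt content needed to land on the target void level.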

Relevance: 100.00%

Abstract:

Portland cement concrete (PCC) pavement undergoes repeated environmental load-related deflection resulting from temperature and moisture variations across the pavement depth. This phenomenon, referred to as PCC pavement curling and warping, has been known and studied since the mid-1920s. Slab curvature can be further magnified under repeated traffic loads and may ultimately lead to fatigue failures, including top-down and bottom-up transverse, longitudinal, and corner cracking. It is therefore important to measure the “true” degree of curling and warping in PCC pavements, not only for quality control (QC) and quality assurance (QA) purposes, but also to achieve a better understanding of its relationship to long-term pavement performance. In order to better understand the curling and warping behavior of PCC pavements in Iowa and provide recommendations to mitigate curling and warping deflections, field investigations were performed at six existing sites during the late fall of 2015. These sites included PCC pavements with various ages, slab shapes, mix design aspects, and environmental conditions during construction. A stationary light detection and ranging (LiDAR) device was used to scan the slab surfaces. The degree of curling and warping along the longitudinal, transverse, and diagonal directions was calculated for the selected slabs based on the point clouds acquired using LiDAR. The results and findings are correlated to variations in pavement performance, mix design, pavement design, and construction details at each site. Recommendations regarding how to minimize curling and warping are provided based on a literature review and this field study. Some examples of using point cloud data to build three-dimensional (3D) models of the overall curvature of the slab shape are presented to show the feasibility of using this 3D analysis method for curling and warping analysis.
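One common way to reduce a LiDAR point cloud of a slab surface to a curvature number is to least-squares fit a low-order surface and read the curvature from its quadratic coefficients. The sketch below is a simplified stand-in for the report's analysis, using a synthetic "curled" slab rather than real scan data:

```python
# Hedged sketch: estimating slab curvature from surface points by
# least-squares fitting z = a + b*x + c*y + d*x^2 + e*y^2. This is a
# simplified stand-in for the point-cloud analysis in the report; the
# slab data below are synthetic.
import numpy as np

def fit_quadratic_surface(x, y, z):
    """Return coefficients (a, b, c, d, e) of the fitted surface."""
    A = np.column_stack([np.ones_like(x), x, y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

# Synthetic curled slab: edges lifted slightly over a ~4.5 m slab.
rng = np.random.default_rng(0)
x = rng.uniform(-2.25, 2.25, 500)
y = rng.uniform(-2.25, 2.25, 500)
true_d = 0.0002  # quadratic curvature term (1/m)
z = true_d * (x**2 + y**2) + rng.normal(0, 1e-5, 500)

a, b, c, d, e = fit_quadratic_surface(x, y, z)
print(f"recovered curvature terms: d={d:.6f}, e={e:.6f}")
```

Fitting along the longitudinal, transverse, and diagonal directions separately, as the report describes, amounts to restricting the same idea to one-dimensional profiles extracted from the cloud.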

Relevance: 100.00%

Abstract:

This thesis discusses market design and regulation in electricity systems, focusing on the information exchange between the regulated grid firm and the generation firms as well as the regulation of the grid firm. In the first chapter, an economic framework is developed to consistently analyze different market designs and the information exchange between the grid firm and the generation firms. Perfect competition between the generation firms and perfect regulation of the grid firm are assumed. A numerical algorithm is developed and its feasibility demonstrated on a large-scale problem. The effects of different market designs for the Central Western European (CWE) region until 2030 are analyzed. In the second chapter, the consequences of restricted grid expansion under the current market design in the CWE region until 2030 are analyzed. In the third chapter, the assumption of efficient markets is relaxed; the analysis then focuses on whether and how inefficiencies in information availability and processing affect different market designs. For different parameter settings, nodal and zonal pricing are compared with respect to their welfare in the spot and forward markets. In the fourth chapter, information asymmetries between the regulator and the regulated firm are analyzed. The optimal regulatory strategy is derived for a firm providing one output with two substitutable inputs, where one input and the absolute quantity of inputs are not observable to the regulator. The result is then compared to current regulatory approaches.
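The nodal-versus-zonal comparison turns on the fact that nodal prices separate when transmission binds. A deliberately stylized two-node illustration (the numbers and the merit-order dispatch rule are illustrative, not the thesis's model):

```python
# Hedged sketch: a stylized two-node market showing why nodal prices
# diverge once the transmission line binds. Numbers are illustrative
# and not drawn from the thesis.

def two_node_dispatch(demand_b, line_cap, cost_a, cap_a, cost_b, cap_b):
    """Serve demand at node B from cheap node-A generation (limited by
    the line) plus local node-B generation. Returns (gen_a, gen_b,
    price_a, price_b) under marginal-cost pricing."""
    gen_a = min(demand_b, line_cap, cap_a)      # cheap import first
    gen_b = demand_b - gen_a                    # remainder served locally
    assert gen_b <= cap_b, "infeasible: not enough capacity"
    price_a = cost_a                            # A has spare cheap capacity
    price_b = cost_a if gen_b == 0 else cost_b  # line binds -> local price
    return gen_a, gen_b, price_a, price_b

# Ample line: one uniform price. Binding line: prices separate.
print(two_node_dispatch(100, 120, 20, 200, 50, 100))  # (100, 0, 20, 20)
print(two_node_dispatch(100, 60, 20, 200, 50, 100))   # (60, 40, 20, 50)
```

Zonal pricing would average over both nodes in a single zone, masking exactly the congestion signal the second case exposes.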

Relevance: 100.00%

Abstract:

In 2005, the University of Maryland acquired over 70 digital videos spanning 35 years of Jim Henson’s groundbreaking work in television and film. To support in-house discovery and use, the collection was cataloged in detail using AACR2 and MARC21, and a web-based finding aid was also created. In the past year, I created an "r-ball" (a linked data set described using RDA) of these same resources. The presentation will compare and contrast these three ways of accessing the Jim Henson Works collection, with insights gleaned from providing resource discovery using RIMMF (RDA in Many Metadata Formats).

Relevance: 100.00%

Abstract:

Scientific applications rely heavily on floating point data types. Floating point operations are complex and require complicated hardware that is both area and power intensive. The emergence of massively parallel architectures like Rigel creates new challenges and poses new questions with respect to floating point support. The massively parallel aspect of Rigel places great emphasis on area efficient, low power designs. At the same time, Rigel is a general purpose accelerator and must provide high performance for a wide class of applications. This thesis presents an analysis of various floating point unit (FPU) components with respect to Rigel, and attempts to present a candidate design of an FPU that balances performance, area, and power and is suitable for massively parallel architectures like Rigel.
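The area and power cost of an FPU follows from the field structure of the IEEE-754 formats its datapath must handle. As a small orienting example (not part of the thesis's design), here is the decomposition of a single-precision value into the sign, exponent, and mantissa fields the hardware operates on:

```python
# Hedged sketch: decomposing an IEEE-754 single-precision value into
# the sign, exponent, and mantissa fields an FPU datapath operates on.
import struct

def fp32_fields(x):
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    sign = bits >> 31
    exponent = (bits >> 23) & 0xFF   # 8 bits, biased by 127
    mantissa = bits & 0x7FFFFF       # 23-bit fraction (implicit leading 1)
    return sign, exponent, mantissa

# -6.5 = -1.625 * 2^2 -> sign 1, exponent 127 + 2 = 129, fraction 0.625
sign, exp, man = fp32_fields(-6.5)
print(sign, exp, hex(man))
```

Every FPU component the thesis weighs (adders, multipliers, normalization logic) is sized by these widths, which is why format choices bear directly on area and power.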

Relevance: 100.00%

Abstract:

This thesis presents approximation algorithms for some NP-Hard combinatorial optimization problems on graphs and networks; in particular, we study problems related to Network Design. Under the widely-believed complexity-theoretic assumption that P is not equal to NP, there are no efficient (i.e., polynomial-time) algorithms that solve these problems exactly. Hence, if one desires efficient algorithms for such problems, it is necessary to consider approximate solutions: An approximation algorithm for an NP-Hard problem is a polynomial-time algorithm which, for any instance of the problem, finds a solution whose value is guaranteed to be within a multiplicative factor of the value of an optimal solution to that instance. We attempt to design algorithms for which this factor, referred to as the approximation ratio of the algorithm, is as small as possible.

The field of Network Design comprises a large class of problems that deal with constructing networks of low cost and/or high capacity, routing data through existing networks, and many related issues. In this thesis, we focus chiefly on designing fault-tolerant networks. Two vertices u,v in a network are said to be k-edge-connected if deleting any set of k − 1 edges leaves u and v connected; similarly, they are k-vertex-connected if deleting any set of k − 1 other vertices or edges leaves u and v connected. We focus on building networks that are highly connected, meaning that even if a small number of edges and nodes fail, the remaining nodes will still be able to communicate. A brief description of some of our results is given below.

We study the problem of building 2-vertex-connected networks that are large and have low cost. Given an n-node graph with costs on its edges and any integer k, we give an O(log n log k) approximation for the problem of finding a minimum-cost 2-vertex-connected subgraph containing at least k nodes. We also give an algorithm of similar approximation ratio for maximizing the number of nodes in a 2-vertex-connected subgraph subject to a budget constraint on the total cost of its edges. Our algorithms are based on a pruning process that, given a 2-vertex-connected graph, finds a 2-vertex-connected subgraph of any desired size and of density comparable to the input graph, where the density of a graph is the ratio of its cost to the number of vertices it contains. This pruning algorithm is simple and efficient, and is likely to find additional applications.

Recent breakthroughs on vertex-connectivity have made use of algorithms for element-connectivity problems. We develop an algorithm that, given a graph with some vertices marked as terminals, significantly simplifies the graph while preserving the pairwise element-connectivity of all terminals; in fact, the resulting graph is bipartite. We believe that our simplification/reduction algorithm will be a useful tool in many settings. We illustrate its applicability by giving algorithms to find many trees that each span a given terminal set, while being disjoint on edges and non-terminal vertices; such problems have applications in VLSI design and other areas. We also use this reduction algorithm to analyze simple algorithms for single-sink network design problems with high vertex-connectivity requirements; we give an O(k log n)-approximation for the problem of k-connecting a given set of terminals to a common sink. We study similar problems in which different types of links, of varying capacities and costs, can be used to connect nodes; assuming there are economies of scale, we give algorithms to construct low-cost networks with sufficient capacity or bandwidth to simultaneously support flow from each terminal to the common sink along many vertex-disjoint paths.

We further investigate capacitated network design, where edges may have arbitrary costs and capacities. Given a connectivity requirement R_uv for each pair of vertices u,v, the goal is to find a low-cost network which, for each uv, can support a flow of R_uv units of traffic between u and v. We study several special cases of this problem, giving both algorithmic and hardness results.

In addition to Network Design, we consider certain Traveling Salesperson-like problems, where the goal is to find short walks that visit many distinct vertices. We give a (2 + epsilon)-approximation for Orienteering in undirected graphs, achieving the best known approximation ratio, and the first approximation algorithm for Orienteering in directed graphs. We also give improved algorithms for Orienteering with time windows, in which vertices must be visited between specified release times and deadlines, and other related problems. These problems are motivated by applications in the fields of vehicle routing, delivery and transportation of goods, and robot path planning.
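The 2-vertex-connectivity property the subgraph algorithms must guarantee can be tested directly: a graph on at least three vertices is 2-vertex-connected iff it is connected and has no articulation point. A standard DFS low-link check (this is the textbook test, not the thesis's pruning algorithm):

```python
# Hedged sketch: the standard DFS test for 2-vertex-connectivity.
# A graph is 2-vertex-connected iff it is connected, has >= 3
# vertices, and contains no articulation point (cut vertex).

def is_two_vertex_connected(adj):
    """adj: dict mapping each vertex to a set of its neighbors."""
    n = len(adj)
    if n < 3:
        return False
    root = next(iter(adj))
    disc, low = {}, {}
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]; timer[0] += 1
        children = 0
        for v in adj[u]:
            if v not in disc:
                children += 1
                if not dfs(v, u):
                    return False
                low[u] = min(low[u], low[v])
                # Non-root u is a cut vertex if some child's subtree
                # cannot reach above u.
                if parent is not None and low[v] >= disc[u]:
                    return False
            elif v != parent:
                low[u] = min(low[u], disc[v])
        # The root is a cut vertex iff it has >= 2 DFS children.
        if parent is None and children >= 2:
            return False
        return True

    return dfs(root, None) and len(disc) == n  # connected and no cut vertex

cycle = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}  # 4-cycle: yes
path = {0: {1}, 1: {0, 2}, 2: {1}}                    # path: no (1 is a cut vertex)
print(is_two_vertex_connected(cycle), is_two_vertex_connected(path))
```

The pruning process described above must preserve exactly this property while shrinking the subgraph to the desired size.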

Relevance: 100.00%

Abstract:

Since 2005, harmonized catch assessment surveys (CASs) have been implemented on Lake Victoria in the three riparian countries Uganda, Kenya, and Tanzania to monitor the commercial fish stocks and inform their management. The regionally harmonized standard operating procedures (SOPs) for CASs have not been wholly followed due to logistical difficulties, yet the alternative approaches adopted have not been documented. This study investigated the alternative approaches used to estimate fish catches on the lake, with the aim of determining the most reliable one for providing management advice, and also examined the effect of the current sampling routine on the precision of the catch estimates. The study found the currently used lake-wide approach less reliable and more biased in its catch estimates than the district-based approach. Noticeable differences were detected in catch estimates between different months of the year. The study recommends that future analyses of CAS data collected on the lake follow the district-based approach. Future CASs should also account for seasonal variations in the sampling design by providing for replicated sampling. The SOPs need updating to document the procedures that deviate from the original sampling design.
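The core difference between the two approaches is that the district-based estimate is a stratified one: each district's sampled catch rate is raised by that district's own fishing effort, rather than applying a single lake-wide average rate. A minimal sketch with synthetic figures (the district names, rates, and boat counts are invented for illustration, not survey data):

```python
# Hedged sketch of why a district-based (stratified) estimate can
# differ from a pooled lake-wide one. District figures are synthetic.

districts = {
    # district: (sampled mean catch per boat-day in kg, number of boats)
    "A": (55.0, 1200),
    "B": (20.0, 3000),
    "C": (80.0, 600),
}

def stratified_total(d):
    # District-based: raise each district's rate by its own effort.
    return sum(mean * boats for mean, boats in d.values())

def pooled_total(d):
    # Lake-wide: one unweighted average rate times total effort.
    rates = [mean for mean, _ in d.values()]
    boats = sum(b for _, b in d.values())
    return sum(rates) / len(rates) * boats

print(f"district-based: {stratified_total(districts):,.0f} kg/day")
print(f"lake-wide:      {pooled_total(districts):,.0f} kg/day")
```

When catch rates and effort are correlated across districts, as here, the pooled estimator is biased; stratification removes that bias, which matches the study's preference for the district-based approach.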