931 results for Goddard Space Flight Center. Mission Operations and Data Systems Directorate.


Relevance: 100.00%

Abstract:

Includes bibliography.

Relevance: 100.00%

Abstract:

When a visual stimulus is continuously moved behind a small stationary window, the window appears displaced in the direction of motion of the stimulus. In this study we showed that the magnitude of this illusion is dependent on (i) whether a perceptual or visuomotor task is used for judging the location of the window, (ii) the directional signature of the stimulus, and (iii) whether or not there is a significant delay between the end of the visual presentation and the initiation of the localization measure. Our stimulus was a drifting sinusoidal grating windowed in space by a stationary, two-dimensional, Gaussian envelope (σ=1 cycle of sinusoid). Localization measures were made following either a short (200 ms) or long (4.2 s) post-stimulus delay. The visuomotor localization error was up to three times greater than the perceptual error for a short delay. However, the visuomotor and perceptual localization measures were similar for a long delay. Our results provide evidence in support of the hypothesis that separate cortical pathways exist for visual perception and visually guided action and that delayed actions rely on stored perceptual information.
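For readers who want to reproduce this class of stimulus, the sketch below generates frames of a drifting sinusoidal grating under a stationary Gaussian window. It is an illustrative reconstruction, not the authors' code: σ = 1 cycle of the sinusoid follows the description above, while the image size, spatial frequency, drift rate, and frame rate are assumptions made for the example.

```python
import numpy as np

def gabor_frames(size=256, cycles_per_image=8, drift_hz=4.0,
                 frame_rate=60.0, n_frames=60):
    """Frames of a drifting sinusoid behind a stationary Gaussian window.

    sigma is one cycle of the sinusoid, as in the stimulus above; all
    other parameter values are illustrative assumptions.
    """
    x = np.linspace(-0.5, 0.5, size)
    xx, yy = np.meshgrid(x, x)
    wavelength = 1.0 / cycles_per_image                  # one cycle, image units
    sigma = wavelength                                   # sigma = 1 cycle
    envelope = np.exp(-(xx**2 + yy**2) / (2 * sigma**2)) # stationary window
    frames = []
    for t in range(n_frames):
        phase = 2 * np.pi * drift_hz * t / frame_rate    # grating drifts over time
        grating = np.sin(2 * np.pi * xx / wavelength + phase)
        frames.append(envelope * grating)                # window fixed, carrier moves
    return np.stack(frames)

frames = gabor_frames()
print(frames.shape)  # (60, 256, 256)
```

The key property of the stimulus is visible in the code: only the carrier phase changes between frames, so any perceived displacement of the window is illusory.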

Relevance: 100.00%

Abstract:

Construction projects are risky. However, the characteristics of the risk depend heavily on the type of procurement adopted for managing the project. A build-operate-transfer (BOT) project is recognized as one of the riskiest project schemes, and there are instances of project failure where a BOT scheme was employed, with ineffective risk management among the contributing factors. Projects are increasingly being managed using various risk management tools and techniques. However, the application of those tools depends on the nature of the project, the organization's policy, the project management strategy, the risk attitude of the project team members, and the availability of resources. An understanding of the contents and contexts of BOT projects, together with a thorough understanding of risk management tools and techniques, helps in selecting risk management processes for effective project implementation under a BOT scheme. This paper studies the application of risk management tools and techniques in BOT projects through a review of the relevant literature and develops a model for selecting a risk management process for BOT projects. The application to BOT projects is considered from the viewpoints of the major project participants, and political risks are also discussed. This study contributes to the establishment of a framework for systematic risk management in BOT projects.

Relevance: 100.00%

Abstract:

Manufacturing planning and control systems are fundamental to the successful operation of a manufacturing organisation. In order to improve their business performance, companies invest significantly in planning and control systems; however, not all companies realise the benefits sought. Many continue to suffer from high levels of inventory, shortages, obsolete parts, poor resource utilisation and poor delivery performance. This thesis argues that the fit between the planning and control system and the manufacturing organisation is a crucial element of success; the design of appropriate control systems is therefore important. The different approaches to the design of manufacturing planning and control systems are investigated. It is concluded that these design methodologies make no provision for properly assessing the impact of a proposed design on the manufacturing facility. Consequently, an understanding of how a new (or modified) planning and control system will perform in the context of the complete manufacturing system is unlikely to be gained until after the system has been implemented and is running. Many modelling techniques are available; however, discrete-event simulation is unique in its ability to model the complex dynamics inherent in manufacturing systems, of which the planning and control system is an integral component. The existing application of simulation to manufacturing control system issues is limited: although operational issues are addressed, application to the more fundamental design of control systems is rarely, if ever, considered. The lack of a suitable simulation-based modelling tool does not help matters. The requirements of a simulation tool capable of modelling a host of different planning and control systems are presented. It is argued that only through the application of object-oriented principles can these extensive requirements be met. This thesis reports on the development of an extensible class library called WBS/Control, which is based on object-oriented principles and discrete-event simulation. The functionality, both current and future, offered by WBS/Control means that different planning and control systems can be modelled: not only the standard implementations but also hybrid systems and new designs. The flexibility implicit in the development of WBS/Control supports its application to both design and operational issues. WBS/Control integrates fully with an existing manufacturing simulator to provide a more complete modelling environment.
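The central design idea, a class library in which control policies plug into a discrete-event engine, can be illustrated with a minimal sketch. This is not the WBS/Control API (the abstract does not detail it); the engine, the reorder-point policy and all parameter values below are assumptions made for the example.

```python
import heapq, random

class Simulator:
    """A minimal discrete-event engine: a time-ordered heap of pending events."""
    def __init__(self):
        self.now, self._queue, self._seq = 0.0, [], 0
    def schedule(self, time, action, *args):
        heapq.heappush(self._queue, (time, self._seq, action, args))
        self._seq += 1                       # tie-breaker keeps heap entries comparable
    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action, args = heapq.heappop(self._queue)
            action(*args)

class ReorderPointControl:
    """One interchangeable control policy; swapping this class changes the
    planning/control logic without touching the simulation engine."""
    def __init__(self, sim, level=20, qty=50, lead_time=5.0):
        self.sim, self.level, self.qty, self.lead_time = sim, level, qty, lead_time
        self.stock, self.on_order = 40, False
    def demand(self, size):
        self.stock -= size
        if self.stock <= self.level and not self.on_order:
            self.on_order = True             # trigger a replenishment order
            self.sim.schedule(self.sim.now + self.lead_time, self.receive)
        self.sim.schedule(self.sim.now + random.expovariate(1.0), self.demand, 1)
    def receive(self):
        self.stock += self.qty
        self.on_order = False

random.seed(1)
sim = Simulator()
control = ReorderPointControl(sim)
sim.schedule(0.0, control.demand, 1)
sim.run(until=100.0)
print(control.stock)
```

The separation shown here is the point the thesis makes: because the policy object owns all planning logic, a hybrid or novel control system is just another subclass run against the same manufacturing model.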

Relevance: 100.00%

Abstract:

Purpose - The main aim of the research is to shed light on the role of information and communication technology (ICT) in the logistics innovation process of small and medium-sized third-party logistics providers (3PLs). Design/methodology/approach - A triangulated research strategy was designed using a combination of quantitative and qualitative methods. The former involved a questionnaire survey of small and medium-sized Italian 3PLs, with 153 usable responses received. The latter comprised a series of focus groups and the use of seven case studies. Findings - There is a relatively low level of ICT expenditure, with few companies adopting formal technology investment strategies. The findings highlight the strategic importance of supply chain integration for 3PLs, with companies that have embarked on an expansion of their service portfolios showing a higher level of both ICT usage and information integration. Lack of technology skills in the workforce is a major constraint on ICT adoption. Given the proliferation of logistics-related ICT tools and applications in recent years, it has been difficult for small and medium-sized 3PLs to select appropriate applications. Research limitations/implications - The paper provides practical guidelines to researchers in the effective use of mixed-methods research based on the concept of methodological triangulation. In particular, it shows how questionnaire surveys, focus groups and case study analysis can be used in combination to provide insights into multi-faceted supply chain phenomena. It also identifies several potentially fruitful avenues for future research in this specific field. Practical implications - The paper's findings provide useful guidance for practitioners on the effective adoption of ICT as part of the logistics innovation process. The findings also provide support for ICT vendors in the design of ICT solutions that are aligned with the needs of small 3PLs. Originality/value - There is currently a paucity of research into the drivers and inhibitors of ICT in the innovation processes of small and medium-sized 3PLs. This paper fills this gap by exploring the issue using a range of complementary research approaches. Copyright © 2013 Emerald Group Publishing Limited. All rights reserved.

Relevance: 100.00%

Abstract:

MEG beamformer algorithms work by assuming that correlated, spatially distinct local field potentials do not develop in the human brain. Despite this assumption, images produced by such algorithms concur with those from other non-invasive and invasive estimates of brain function. In this paper we set out to develop a method that could be applied to raw MEG data to test this assumption explicitly. We show that a promax rotation of MEG channel data can be used as an approximate estimator of the number of spatially distinct correlated sources in any frequency band.
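As a rough illustration of the rotation step, the sketch below implements textbook promax (varimax followed by an oblique least-squares rotation toward a power target) on a channels-by-components loading matrix such as might come from a PCA of band-limited channel data. This is a generic implementation under standard definitions, not the authors' pipeline; the power of 4, the number of retained components, and the synthetic data are conventional assumptions.

```python
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a loading matrix L (channels x components)."""
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        B = L @ R
        u, s, vh = np.linalg.svd(
            L.T @ (B**3 - (gamma / p) * B @ np.diag((B**2).sum(axis=0))))
        R = u @ vh
        d_new = s.sum()
        if d != 0.0 and d_new / d < 1 + tol:
            break
        d = d_new
    return L @ R

def promax(L, power=4):
    """Oblique promax: varimax followed by a least-squares power-target rotation."""
    V = varimax(L)
    target = V * np.abs(V) ** (power - 1)             # sharpen the simple structure
    U, *_ = np.linalg.lstsq(V, target, rcond=None)    # rotate toward the target
    U = U * np.sqrt(np.diag(np.linalg.inv(U.T @ U)))  # normalise rotation columns
    return V @ U

# toy usage: loadings from a PCA of (band-pass filtered) channel data
rng = np.random.default_rng(0)
data = rng.standard_normal((5000, 32))                # stand-in for filtered MEG channels
_, s, Vt = np.linalg.svd(data - data.mean(0), full_matrices=False)
loadings = (Vt[:6].T * s[:6]) / np.sqrt(len(data))    # keep six components
print(promax(loadings).shape)                         # (32, 6)
```

Counting the rotated columns that carry substantial loadings then gives the kind of approximate source count the abstract describes.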

Relevance: 100.00%

Abstract:

Electrical energy is an essential resource for the modern world. Unfortunately, its price has almost doubled in the last decade. Furthermore, energy production is currently one of the primary sources of pollution. These concerns are becoming more important in data-centers. As more computational power is required to serve hundreds of millions of users, bigger data-centers are becoming necessary, and this results in higher electrical energy consumption. Of all the energy used in data-centers, including power distribution units, lights, and cooling, computer hardware consumes as much as 80%. Consequently, there is an opportunity to make data-centers more energy efficient by designing systems with a lower energy footprint. Consuming less energy is critical not only in data-centers; it is also important in mobile devices, where battery-based energy is a scarce resource. Reducing the energy consumption of these devices will allow them to last longer and recharge less frequently. Saving energy in computer systems is a challenging problem, because improving a system's energy efficiency usually comes at the cost of compromises in other areas such as performance or reliability. In the case of secondary storage, for example, spinning down the disks to save energy can incur high latencies if they are accessed while in this state. The challenge is to increase energy efficiency while keeping the system as reliable and responsive as before. This thesis tackles the problem of improving energy efficiency in existing systems while reducing the impact on performance. First, we propose a new technique to achieve fine-grained energy proportionality in multi-disk systems; second, we design and implement an energy-efficient cache system using flash memory that increases disk idleness to save energy; finally, we identify and explore solutions for the page fetch-before-update problem in caching systems that can (a) better control I/O traffic to secondary storage and (b) provide critical performance improvements for energy-efficient systems.
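The second contribution's core idea, using a flash layer to absorb I/O so the disk stays idle longer, can be sketched in a few lines. This is a toy illustration, not the thesis's system (whose design is not given here); the cache class, capacity, and flush threshold are invented for the example.

```python
from collections import OrderedDict

class FlashWriteCache:
    """Toy write-back cache: absorb writes in flash and flush them in batches,
    so the backing disk can remain spun down between flushes."""
    def __init__(self, disk_write, flush_at=256):
        self.disk_write = disk_write          # callable(block_id, data): one disk I/O
        self.flush_at = flush_at
        self.dirty = OrderedDict()            # block_id -> data, in write order
    def write(self, block_id, data):
        self.dirty[block_id] = data           # coalesce rewrites of the same block
        self.dirty.move_to_end(block_id)
        if len(self.dirty) >= self.flush_at:
            self.flush()                      # one disk wake-up per batch
    def read(self, block_id, disk_read):
        if block_id in self.dirty:            # serve hits without waking the disk
            return self.dirty[block_id]
        return disk_read(block_id)
    def flush(self):
        while self.dirty:
            block_id, data = self.dirty.popitem(last=False)
            self.disk_write(block_id, data)

writes = []
cache = FlashWriteCache(lambda b, d: writes.append(b), flush_at=4)
for i in range(8):
    cache.write(i % 5, b"x")                  # blocks 0-4, with some rewrites
print(len(writes))  # 8 writes reach the disk, but in two batches (two wake-ups)
```

Batching is what converts many short disk accesses into a few long idle periods, which is where spin-down policies recover energy.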

Relevance: 100.00%

Abstract:

The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures in order to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with Partial Turbulence Simulation. In this dissertation, the test procedure and scaling requirements for tests with partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data. For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings such as roof pavers are not well covered by many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of existing information in codes and standards such as ASCE 7-10 on pressure coefficients for components and cladding.
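To make the two-process idea concrete, here is a hedged numerical sketch: the high-frequency process is represented by a stand-in pressure-coefficient record, the missing low-frequency turbulence by a slowly varying Gaussian velocity fluctuation applied quasi-steadily (pressure scaling with (U + u)^2), and the combined record yields larger peaks than the tunnel record alone. All signals and parameter values are synthetic assumptions; the dissertation derives the joint probability of load analytically rather than by simulation.

```python
import numpy as np

rng = np.random.default_rng(42)
fs, T = 500.0, 600.0                          # sample rate [Hz], record length [s]
n = int(fs * T)

# high-frequency process: stand-in for a wind-tunnel Cp record around a mean suction
cp_hf = -0.5 + 0.2 * rng.standard_normal(n)

# low-frequency process: slowly varying gust ratio u/U, below the tunnel's cutoff
n_slow = int(T / 2.0)                         # one independent draw every 2 s
u_over_U = 0.1 * rng.standard_normal(n_slow)  # 10% missing low-frequency turbulence
u_over_U = np.repeat(u_over_U, n // n_slow)[:n]

# quasi-steady combination: pressure scales with (U + u)^2 = U^2 (1 + u/U)^2
cp_full = cp_hf * (1.0 + u_over_U) ** 2

print("tunnel-only peak suction:", cp_hf.min())
print("with low-freq. process :", cp_full.min())  # peaks grow once slow gusts are added
```

The gap between the two printed peaks is the correction that the proposed post-test analysis supplies without having to reproduce the low-frequency gusts in the tunnel.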

Relevance: 100.00%

Abstract:

I studied two distinct aspects of the Ten-Eleven Translocation 2 (TET2) protein to understand its specific functions in different body systems. In Part I, I characterized the molecular mechanisms of Tet2 in the hematological system. As the second member of the Ten-Eleven Translocation protein family, TET2 is frequently mutated in leukemic patients. Previous studies have shown that TET2 mutations occur in 20% of myelodysplastic syndrome/myeloproliferative neoplasm (MDS/MPN) cases, 10% of T-cell lymphomas/leukemias, and 2% of B-cell lymphomas/leukemias. Genetic mouse models also display distinct phenotypes of various types of hematological malignancies. I performed 5-hydroxymethylcytosine (5hmC) chromatin immunoprecipitation sequencing (ChIP-Seq) and RNA sequencing (RNA-Seq) of hematopoietic stem/progenitor cells to determine whether the deletion of Tet2 affects the abundance of 5hmC at myeloid-, T-cell- and B-cell-specific gene transcription start sites, ultimately resulting in various hematological malignancies. Subsequent exome sequencing (Exome-Seq) showed that disease-specific genes are mutated in different types of tumors, which suggests that TET2 may protect the genome from mutation. The direct interaction between TET2 and MutS Homolog 6 (MSH6) suggests that TET2 is involved in DNA mismatch repair. Finally, in vivo mismatch repair studies show that the loss of Tet2 causes a mutator phenotype. Taken together, my data indicate that TET2 binds to MSH6 to protect genome integrity. In Part II, I sought to better understand the role of Tet2 in the nervous system. 5-hydroxymethylcytosine regulates epigenetic modification during neurodevelopment and aging; thus, Tet2 may play a critical role in regulating adult neurogenesis. To examine the physiological significance of Tet2 in the nervous system, I first showed that the deletion of Tet2 reduces 5hmC levels in neural stem cells. Mice lacking Tet2 show abnormal hippocampal neurogenesis, along with 5hmC alterations at different gene promoters and corresponding downregulation of gene expression. Luciferase reporter assays showed that two neural factors, Neurogenic differentiation 1 (NeuroD1) and Glial fibrillary acidic protein (Gfap), were down-regulated in Tet2 knockout cells. My results suggest that Tet2 regulates neural stem/progenitor cell proliferation and differentiation in the adult brain.

Relevance: 100.00%

Abstract:

The objective of this research was to develop a methodology for transforming and dynamically segmenting data. Dynamic segmentation enables transportation system attributes and associated data to be stored in separate tables and merged when a specific query requires a particular set of data to be considered. A major benefit of dynamic segmentation is that individual tables can be more easily updated when attributes, performance characteristics, or usage patterns change over time. Applications of a progressive geographic database referencing system in transportation planning are vast. Summaries of system condition and performance can be made, and analyses of specific portions of a road system are facilitated.
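As an illustration of what dynamic segmentation does (a generic sketch, not this research's implementation), the function below merges two linearly referenced attribute tables, each storing (from, to) mileposts along a route, into the homogeneous segments implied by their combined breakpoints. The attribute names and values are invented for the example.

```python
def dynamic_segments(*tables):
    """Merge linearly referenced attribute tables into homogeneous segments.

    Each table is a list of (start_mile, end_mile, {attrs}); the output
    breaks wherever any input table's value changes, so the tables can be
    stored and updated separately and merged only at query time.
    """
    cuts = sorted({m for t in tables for s, e, _ in t for m in (s, e)})
    segments = []
    for start, end in zip(cuts, cuts[1:]):
        attrs = {}
        for t in tables:
            for s, e, a in t:
                if s <= start and end <= e:   # this record covers the slice
                    attrs.update(a)
        segments.append((start, end, attrs))
    return segments

pavement = [(0.0, 4.2, {"surface": "asphalt"}), (4.2, 9.0, {"surface": "concrete"})]
traffic  = [(0.0, 6.5, {"aadt": 12000}), (6.5, 9.0, {"aadt": 18000})]
for seg in dynamic_segments(pavement, traffic):
    print(seg)
# (0.0, 4.2, ...), (4.2, 6.5, ...), (6.5, 9.0, ...): merged on the fly, stored apart
```

The benefit described above falls out of this structure: repaving miles 0.0-4.2 means updating one row of one table, with no need to rebuild the merged view.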

Relevance: 100.00%

Abstract:

Abstract. Two ideas taken from Bayesian optimization and classifier systems are presented for personnel scheduling based on choosing a suitable scheduling rule from a set for each person's assignment. Unlike our previous work using genetic algorithms, whose learning is implicit, the learning in both approaches is explicit, i.e. we are able to identify building blocks directly. To achieve this, the Bayesian optimization algorithm builds a Bayesian network of the joint probability distribution of the rules used to construct solutions, while the adapted classifier system assigns each rule a strength value that is constantly updated according to its usefulness in the current situation. Computational results from 52 real data instances of nurse scheduling demonstrate the success of both approaches. It is also suggested that the learning mechanism in the proposed approaches might be suitable for other scheduling problems.
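A hedged sketch of the classifier-system idea: each scheduling rule carries a strength that is reinforced when applying it works well and decayed otherwise, and rules are drawn in proportion to strength. The rule names, reward scheme, and learning rate below are invented for illustration; the paper's classifier system and its Bayesian-network counterpart are more elaborate.

```python
import random

rules = ["highest_cover", "lowest_cost", "random_fit"]   # hypothetical rule names
strength = {r: 1.0 for r in rules}
beta = 0.2                                               # learning rate (assumed)

def pick_rule():
    """Roulette-wheel selection: probability proportional to current strength."""
    total = sum(strength.values())
    x, acc = random.uniform(0, total), 0.0
    for r in rules:
        acc += strength[r]
        if x <= acc:
            return r
    return rules[-1]

def update(rule, reward):
    """Reinforce rules that produced good assignments; decay the rest."""
    strength[rule] += beta * (reward - strength[rule])

random.seed(0)
for _ in range(200):                     # stand-in for scoring real assignments
    r = pick_rule()
    reward = {"highest_cover": 2.0, "lowest_cost": 1.5, "random_fit": 0.5}[r]
    update(r, reward)
print(strength)                          # stronger rules come to dominate selection
```

The Bayesian optimization variant replaces these independent strengths with a learned network over rule combinations, which is what makes the building blocks explicit.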

Relevance: 100.00%

Abstract:

In this document we describe the statistics, data, and importance of the 13th CONTECSI – International Conference on Information Systems and Technology Management, which took place at the University of São Paulo from June 1st through 3rd and was organized by TECSI/EAC/FEA/USP/ECA/POLI. This report presents statistics of the 13th CONTECSI, Goals and Objectives, Program, Plenary Sessions, Doctoral Consortium, Parallel Sessions, Honorable Mentions and Committees. We would like to acknowledge the vital financial support of CAPES, CNPq and FAPESP, as well as the support of FEA USP, POLI USP, ECA USP, ANPAD, AIS, ISACA, UNINOVE, Mackenzie, Universidade do Porto, Rutgers School/USA, São Paulo Convention Bureau and CCINT-FEA-USP.
