67 results for Context-aware systems
Abstract:
With the beginning of airline deregulation in 1978, U.S. domestic operations were in for a period of turmoil, adjustment, vibrancy, entrepreneurship, and change. A great deal has been written about the effects of deregulation on airlines and their personnel, and on the public at large. Less attention has been paid to the effects on travel agents and to the seminal role of computerized reservations systems (CRSs) in the flowering of travel agencies. This article examines both of these phenomena.
Abstract:
The maturation of the cruise industry has led to increased competition, which demands more efficient operations. Systems engineering, a discipline that studies complex organizations of material, people, and information, has traditionally been applied only in the manufacturing sector; however, it can make significant contributions to service industries such as the cruise industry. The author describes this type of engineering, explores how it can be applied to the cruise industry, and presents two case studies of its application to the luggage delivery process and the information technology help desk process. The results show that this approach can make these processes more productive and enhance profitability for cruise lines.
Abstract:
An electronic database support system for strategic planning activities can be built by providing conceptual and system specific information. The design and development of this type of system center around the information needs of strategy planners. Data that supply information on the organization's internal and external environments must be originated, evaluated, collected, organized, managed, and analyzed. Strategy planners may use the resulting information to improve their decision making.
Abstract:
Although there are more than 7,000 properties using lodging yield management systems (LYMSs), practitioners and researchers alike have found it difficult to measure their success. Considerable research was performed in the 1980s to develop success measures for information systems in general. In this work the author develops success measures specifically for LYMSs.
Abstract:
Catering to society's demand for high-performance computing, billions of transistors are now integrated on IC chips to deliver unprecedented performance. With increasing transistor density, power consumption and power density are growing exponentially. The increasing power consumption translates directly into high chip temperature, which not only raises packaging/cooling costs, but also degrades the performance, reliability, and life span of computing systems. Moreover, high chip temperature also greatly increases leakage power consumption, which is becoming ever more significant with the continuous scaling of transistor size. As the semiconductor industry continues to evolve, power and thermal challenges have become the most critical challenges in the design of new generations of computing systems. In this dissertation, we addressed the power/thermal issues from the system-level perspective. Specifically, we sought to employ real-time scheduling methods to optimize the power/thermal efficiency of real-time computing systems, with the leakage/temperature dependency taken into consideration. In our research, we first explored the fundamental principles of how to employ dynamic voltage scaling (DVS) techniques to reduce the peak operating temperature when running a real-time application on a single-core platform. We further proposed a novel real-time scheduling method, "M-Oscillations," to reduce the peak temperature when scheduling a hard real-time periodic task set. We also developed three checking methods to guarantee the feasibility of a periodic real-time schedule under a peak temperature constraint. We then extended our research from single-core to multi-core platforms. We investigated the energy estimation problem on multi-core platforms and developed a lightweight and accurate method to calculate the energy consumption for a given voltage schedule on a multi-core platform. Finally, we concluded the dissertation with an elaborated discussion of future extensions of our research.
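The leakage/temperature interplay described above can be illustrated with a small simulation. The sketch below (Python) integrates a lumped RC thermal model over a piecewise-constant voltage schedule and reports total energy and peak temperature. All coefficients (C_EFF, K0, K1, R_TH, C_TH) and the schedules are invented for illustration; this is not the dissertation's model or the M-Oscillations algorithm, only the kind of calculation involved.

```python
# Hedged sketch: energy and peak temperature of a voltage schedule on a
# single core, with a leakage term linearized in temperature.
# All coefficients are illustrative assumptions.

C_EFF = 1.0             # effective switching capacitance (illustrative)
K0, K1 = 0.1, 0.01      # linearized leakage: P_leak = V * (K0 + K1 * T)
R_TH, C_TH = 0.8, 10.0  # thermal resistance / capacitance (illustrative)
T_AMB = 25.0            # ambient temperature, deg C
DT = 0.01               # integration step, s

def simulate(schedule, t0=T_AMB):
    """schedule: list of (voltage, duration_s); returns (energy, peak_temp)."""
    temp, energy, peak = t0, 0.0, t0
    for v, dur in schedule:
        for _ in range(int(dur / DT)):
            freq = v                       # assume frequency scales ~linearly with V
            p_dyn = C_EFF * v * v * freq   # dynamic power ~ C * V^2 * f
            p_leak = v * (K0 + K1 * temp)  # leakage grows with temperature
            power = p_dyn + p_leak
            # lumped RC thermal model: dT/dt = (P - (T - T_amb)/R_th) / C_th
            temp += DT * (power - (temp - T_AMB) / R_TH) / C_TH
            energy += power * DT
            peak = max(peak, temp)
    return energy, peak

# Oscillating between a high and a low voltage (in the spirit of the
# "M-Oscillations" idea described above) vs. running flat-out:
print(simulate([(1.0, 5), (0.6, 5)] * 4))
print(simulate([(1.0, 40)]))
```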
Abstract:
Trenchless methods have been considered a viable solution for pipeline projects in urban areas. Their applicability in pipeline projects is expected to increase with rapid advancements in technology and emerging concerns regarding the social costs of trenching methods. Selecting an appropriate project delivery system (PDS) is key to the success of trenchless projects. To ensure project success, the selected delivery system should be tailored to the specific characteristics of the trenchless project and the owner's needs, since the effectiveness of project delivery systems varies with project characteristics and owner requirements. Because different trenchless methods have specific characteristics, such as rate of installation, length of installation, and accuracy, the same project delivery system may not be equally effective for different methods. The intent of this paper is to evaluate the appropriateness of different PDS for different trenchless methods. PDS are examined through a structured decision-making process called the Fuzzy Delivery System Selection Model (FDSSM). The impacts of (a) the characteristics of trenchless projects and (b) owners' needs are incorporated in the FDSSM by collecting data through questionnaires deployed to professionals in the trenchless industry, in order to determine the importance of delivery system selection attributes for different trenchless methods, and then analyzing these data. The sensitivity of PDS rankings with respect to trenchless methods is considered in order to evaluate whether similar project delivery systems are equally effective for different trenchless methods. The effectiveness of a PDS with respect to an attribute (e.g., ability to control growth in costs) is defined as follows: a project delivery system is most effective with respect to an attribute if no other project delivery system is more effective than it. The results of this study may assist trenchless project owners in selecting the appropriate PDS for the trenchless method selected.
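As a rough illustration of how a fuzzy multi-attribute selection of this kind can be organized, the sketch below (Python) scores two hypothetical delivery systems against invented attribute weights expressed as triangular fuzzy numbers and ranks them by a defuzzified score. The attributes, weights, ratings, and aggregation rule are assumptions for illustration only; they are not the FDSSM questionnaire data or its actual procedure.

```python
# Hedged sketch: ranking project delivery systems (PDS) for a trenchless
# method with triangular fuzzy weights and ratings. All numbers are invented.

def tfn_mul(a, b):   # product of two triangular fuzzy numbers (l, m, u)
    return tuple(x * y for x, y in zip(a, b))

def tfn_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(t):    # centroid of a triangular fuzzy number
    return sum(t) / 3.0

# Attribute importance for a hypothetical trenchless method:
weights = {
    "cost_growth_control": (0.7, 0.8, 0.9),
    "schedule_certainty":  (0.5, 0.6, 0.7),
    "owner_control":       (0.3, 0.4, 0.5),
}

# Fuzzy ratings of each delivery system against each attribute:
ratings = {
    "Design-Bid-Build": {"cost_growth_control": (0.4, 0.5, 0.6),
                         "schedule_certainty":  (0.3, 0.4, 0.5),
                         "owner_control":       (0.7, 0.8, 0.9)},
    "Design-Build":     {"cost_growth_control": (0.6, 0.7, 0.8),
                         "schedule_certainty":  (0.7, 0.8, 0.9),
                         "owner_control":       (0.4, 0.5, 0.6)},
}

def score(pds):
    total = (0.0, 0.0, 0.0)
    for attr, w in weights.items():
        total = tfn_add(total, tfn_mul(w, ratings[pds][attr]))
    return defuzzify(total)

for pds in sorted(ratings, key=score, reverse=True):
    print(f"{pds}: {score(pds):.3f}")
```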
Abstract:
Infrastructure systems are drivers of the national economy. A dollar spent on infrastructure development yields roughly double the initial spending in ultimate economic output in the short term, and over a twenty-year period generalized 'public investment' produces an aggregate $3.21 of economic activity per $1.00 spent [1]. Thus, the formulation of policies pertaining to infrastructure investment and development is significant for the social and economic wellbeing of the nation. The aim of this policy brief is to evaluate innovative financing in infrastructure systems from two different perspectives: (1) through consideration of the current condition of infrastructure in the U.S., current trends in public spending, and emerging innovative financing tools; and (2) through evaluation of the roles and interactions of different agencies in the creation and diffusion of innovative financing tools. Using the example of transportation financing, the policy brief then assesses the policy landscape that could close the infrastructure financing gap in the U.S. and proposes strategies for citizen involvement to build public support for innovative financing.
Abstract:
Mapping of vegetation patterns over large extents using remote sensing methods requires field sample collection for two different purposes: (1) the establishment of plant association classification systems from samples of relative abundance estimates; and (2) training for supervised image classification and accuracy assessment of maps derived from satellite data. One challenge for both procedures is establishing confidence in results and enabling analysis across multiple spatial scales. Continuous data sets that enable cross-scale studies are very time consuming and expensive to acquire, and such extensive field sampling can be invasive. The use of high-resolution aerial photography (hrAP) offers an alternative to extensive, invasive field sampling and can provide large-volume, spatially continuous reference information that can meet the challenges of confidence building and multi-scale analysis.
Abstract:
This poster presentation from the May 2015 Florida Library Association Conference, along with the Everglades Explorer discovery portal at http://ee.fiu.edu, demonstrates how traditional bibliographic and curatorial principles can be applied to: 1) selection, cross-walking and aggregation of metadata linking end-users to widespread digital resources from multiple silos; 2) harvesting of select PDFs, HTML and media for web archiving and access; and 3) selection of CMS domains, sub-domains and folders for targeted searching using an API. Choosing content for this discovery portal is comparable to the past scholarly practice of creating and publishing subject bibliographies, except that metadata and data are housed in relational databases. This new yet traditional capacity coincides with the growth of bibliographic utilities (MarcEdit), the evolution of open-source discovery systems (eXtensible Catalog), the development of target-capable web crawling and archiving systems (Archive-It), and specialized search APIs (Google). At the same time, historical and technical changes, specifically the increasing fluidity and re-purposing of syndicated metadata, make this possible. It equally stems from the expansion of freely accessible digitized legacy and born-digital resources. Innovation principles helped frame the process by which the thematic Everglades discovery portal was created at Florida International University. The path to more effective searching and co-location of digital scientific, educational and historical material related to the Everglades is contextualized through five concepts found within Dyer and Christensen's "The Innovator's DNA: Mastering the Five Skills of Disruptive Innovators" (2011). The project also aligns with Ranganathan's Laws of Library Science, especially the 4th Law: to "save the time of the user."
Abstract:
The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with partial turbulence simulation. In this dissertation, the test procedure and scaling requirements for tests in partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory the turbulence spectrum is divided into two distinct statistical processes, one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predicted data, derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University, with the available full-scale data. For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings like roof pavers are not well addressed in many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of existing information in codes and standards such as ASCE 7-10 on pressure coefficients for components and cladding.
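The quasi-steady combination of the two turbulence ranges can be illustrated with a small Monte Carlo sketch (Python, using NumPy). A synthetic stand-in for a measured high-frequency Cp record is rescaled by a Gaussian low-frequency gust factor to produce a full-scale-equivalent peak. The turbulence intensity, the Gaussian model, and the synthetic record are all assumptions; the sketch does not reproduce the dissertation's derivation of the joint probability of load.

```python
# Hedged sketch: combining a high-frequency pressure-coefficient record with a
# quasi-steady low-frequency gust process to estimate a full-scale-equivalent
# peak Cp. All inputs are synthetic and illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a measured high-frequency Cp time series:
cp_hf = -0.5 + 0.15 * rng.standard_normal(100_000)

# Low-frequency longitudinal turbulence missing from the tunnel flow,
# treated quasi-steadily: each "gust" rescales the dynamic pressure by
# (1 + u_L / U_mean)^2.
I_low = 0.12                                   # assumed low-frequency intensity
u_over_U = I_low * rng.standard_normal(100_000)
gust_factor = (1.0 + u_over_U) ** 2

# Combination of the two processes (here assumed independent):
cp_full = cp_hf * gust_factor

print("measured peak (suction):    %.2f" % cp_hf.min())
print("full-scale-equivalent peak: %.2f" % cp_full.min())
```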
Abstract:
Many systems and applications continuously produce events. These events are used to record the status of a system and trace its behaviors. By examining these events, system administrators can check for potential problems. If the temporal dynamics of the systems are further investigated, underlying patterns can be discovered. The uncovered knowledge can be leveraged to predict future system behaviors or to mitigate potential risks. Moreover, system administrators can use the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more and more useful. Despite recent advances in data mining techniques, their application to system event mining is still in a rudimentary stage. Most works still focus on episode mining or frequent pattern discovery. These methods are unable to provide a brief yet comprehensible summary that reveals the valuable information from a high-level perspective. Moreover, they provide little actionable knowledge to help system administrators better manage their systems. To make better use of the recorded events, more practical techniques are required. From the perspective of data mining, three correlated directions are considered helpful for system management: (1) providing concise yet comprehensive summaries of the running status of the systems; (2) making the systems more intelligent and autonomous; and (3) effectively detecting abnormal system behaviors. Owing to the richness of the event logs, all these directions can be pursued in a data-driven manner, enhancing the robustness of the systems and approaching the goal of autonomous management. This dissertation focuses on the foregoing directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed: event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are also presented to demonstrate the effectiveness and efficacy of the corresponding solutions.
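As a minimal illustration of the streaming anomaly detection direction mentioned above, the sketch below (Python) flags per-minute event counts that deviate strongly from a sliding-window baseline using a z-score test. The window size, threshold, and synthetic event stream are assumptions; the dissertation's actual techniques are not reproduced here.

```python
# Hedged sketch: a generic streaming anomaly detector over event counts.
# Not the dissertation's algorithm; only an illustration of the idea.

from collections import deque
from math import sqrt

def stream_anomalies(counts, window=60, threshold=3.0):
    """Yield (index, count, z_score) for counts that look anomalous."""
    history = deque(maxlen=window)
    for i, c in enumerate(counts):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((x - mean) ** 2 for x in history) / window
            std = sqrt(var) or 1.0          # avoid division by zero
            z = (c - mean) / std
            if abs(z) > threshold:
                yield i, c, z
        history.append(c)

# Example: a stable event rate with one burst at minute 120.
events = [100] * 200
events[120] = 400
for minute, count, z in stream_anomalies(events):
    print(f"minute {minute}: count={count}, z={z:.1f}")
```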
Abstract:
I propose the study of two distinct aspects of the Ten-Eleven Translocation 2 (TET2) protein to understand its specific functions in different body systems. In Part I, I characterized the molecular mechanisms of Tet2 in the hematological system. As the second member of the Ten-Eleven Translocation protein family, TET2 is frequently mutated in leukemic patients. Previous studies have shown that TET2 mutations occur in 20% of myelodysplastic syndrome/myeloproliferative neoplasm (MDS/MPN), 10% of T-cell lymphoma/leukemia, and 2% of B-cell lymphoma/leukemia cases. Genetic mouse models also display distinct phenotypes of various types of hematological malignancies. I performed 5-hydroxymethylcytosine (5hmC) chromatin immunoprecipitation sequencing (ChIP-Seq) and RNA sequencing (RNA-Seq) of hematopoietic stem/progenitor cells to determine whether the deletion of Tet2 affects the abundance of 5hmC at myeloid-, T-cell- and B-cell-specific gene transcription start sites, which ultimately results in various hematological malignancies. Subsequent exome sequencing (Exome-Seq) showed that disease-specific genes are mutated in different types of tumors, which suggests that TET2 may protect the genome from being mutated. The direct interaction between TET2 and the MutS Homolog 6 (MSH6) protein suggests that TET2 is involved in DNA mismatch repair. Finally, in vivo mismatch repair studies show that the loss of Tet2 causes a mutator phenotype. Taken together, my data indicate that TET2 binds to MSH6 to protect genome integrity. In Part II, I sought to better understand the role of Tet2 in the nervous system. 5-hydroxymethylcytosine regulates epigenetic modification during neurodevelopment and aging, so Tet2 may play a critical role in regulating adult neurogenesis. To examine the physiological significance of Tet2 in the nervous system, I first showed that the deletion of Tet2 reduces 5hmC levels in neural stem cells. Mice lacking Tet2 show abnormal hippocampal neurogenesis, along with 5hmC alterations at different gene promoters and corresponding downregulation of gene expression. Luciferase reporter assays showed that two neural factors, Neurogenic differentiation 1 (NeuroD1) and Glial fibrillary acidic protein (Gfap), were down-regulated in Tet2 knockout cells. My results suggest that Tet2 regulates neural stem/progenitor cell proliferation and differentiation in the adult brain.
Abstract:
This research analyzed the spatial relationship between a mega-scale fracture network and the occurrence of vegetation in an arid region. High-resolution aerial photographs of Arches National Park, Utah were used for digital image processing. Four sets of large-scale joints were digitized from the rectified color photograph in order to characterize the geospatial properties of the fracture network with the aid of a Geographic Information System. An unsupervised landcover classification was carried out to identify the spatial distribution of vegetation on the fractured outcrop. Results of this study confirm that the WNW-ESE alignment of vegetation is dominantly controlled by the spatial distribution of the systematic joint set, which in turn parallels the regional fold axis. This research provides insight into the spatial heterogeneity inherent to fracture networks, as well as the effects of jointing on the distribution of surface vegetation in desert environments.
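A minimal sketch of an unsupervised landcover classification of this kind is shown below (Python with scikit-learn): pixel band values of a small synthetic image are clustered with k-means into two classes. The synthetic bands and the number of classes are illustrative assumptions, not the study's data or parameters.

```python
# Hedged sketch: unsupervised landcover classification via k-means clustering
# of pixel band values. The tiny synthetic "image" stands in for the rectified
# aerial photograph; all values are invented.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic 3-band image (rows x cols x bands): bright bare rock vs.
# darker, greener vegetated pixels.
rock = rng.normal([180, 170, 160], 10, size=(40, 60, 3))
veg = rng.normal([80, 120, 70], 10, size=(40, 60, 3))
image = np.concatenate([rock, veg], axis=0)          # 80 x 60 x 3

pixels = image.reshape(-1, 3)                        # one row per pixel
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
classmap = labels.reshape(image.shape[:2])           # back to image shape

for k in range(2):
    print(f"class {k}: {np.mean(classmap == k):.0%} of pixels")
```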
Abstract:
This research aimed to develop a research framework for the emerging field of enterprise systems engineering (ESE). The framework consists of an ESE definition, an ESE classification scheme, and an ESE process. This study views an enterprise as a system that creates value for its customers; accordingly, developing the framework made use of systems theory and IDEF methodologies. This study defined ESE as an engineering discipline that develops and applies systems theory and engineering techniques to the specification, analysis, design, and implementation of an enterprise over its life cycle. The proposed ESE classification scheme breaks an enterprise system down into four elements: work, resources, decision, and information. Each enterprise element is specified with four system facets: strategy, competency, capacity, and structure. Each element-facet combination is subject to the engineering process of specification, analysis, design, and implementation, to achieve its pre-specified performance with respect to cost, time, quality, and benefit to the enterprise. This framework is intended for identifying research voids in the ESE discipline. It also helps to apply engineering and systems tools to this emerging field, harnesses the relationships among various enterprise aspects, and bridges the gap between engineering and management practices in an enterprise. The proposed ESE process is generic. It consists of a hierarchy of engineering activities presented in an IDEF0 model, with each activity defined by its inputs, outputs, constraints, and mechanisms. The output of an ESE effort can be a partial or whole enterprise system design for its physical, managerial, and/or informational layers. The proposed ESE process is applicable to a new enterprise system design or to an engineering change in an existing system. The long-term goal of this study is the development of a scientific foundation for ESE research and development.
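The element-facet structure of the proposed classification scheme can be captured in a simple data structure, sketched below in Python. The element, facet, phase, and performance names come from the abstract; the dictionary layout itself is just one illustrative way to organize the scheme, not part of the original framework.

```python
# Hedged sketch: the ESE classification scheme as a data structure --
# four enterprise elements crossed with four system facets, each combination
# carrying the four engineering phases and performance dimensions.

from itertools import product

ELEMENTS = ["work", "resources", "decision", "information"]
FACETS = ["strategy", "competency", "capacity", "structure"]
PHASES = ["specification", "analysis", "design", "implementation"]
PERFORMANCE = ["cost", "time", "quality", "benefit"]

# One engineering "cell" per element-facet combination:
scheme = {
    (element, facet): {"phases": PHASES, "measures": PERFORMANCE}
    for element, facet in product(ELEMENTS, FACETS)
}

print(f"{len(scheme)} element-facet combinations")   # 16 cells
print(scheme[("work", "capacity")]["phases"])
```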
Abstract:
Since the introduction of fiber reinforced polymers (FRP) for the repair and retrofit of concrete structures in the 1980s, considerable research has been devoted to the feasibility of their application and to predictive modeling of their performance. However, the effects of flaws present in the constitutive components and of the practices in substrate preparation and treatment have not yet been thoroughly studied. This research aims to investigate the effect of surface preparation and treatment for pre-cured FRP systems and the groove size tolerance for near surface mounted (NSM) FRP systems, and to set thresholds for guaranteed system performance. The research included both analytical and experimental components. The experimental program for the pre-cured FRP systems consisted of a total of twenty-four (24) reinforced concrete (RC) T-beams with various surface preparation parameters and surface flaws, including roughness, flatness, voids and cracks (cuts). For the NSM FRP systems, a total of twelve (12) additional RC T-beams were tested with different groove sizes for FRP bars and strips. The analytical program included developing an elaborate nonlinear finite element model using the general-purpose software ANSYS. The model was subsequently used to extend the experimental range of parameters for surface flatness in pre-cured FRP systems, and for the groove size study in the NSM FRP systems. Test results, confirmed by further analyses, indicated that, contrary to the general belief in the industry, the impact of surface roughness on the global performance of pre-cured FRP systems was negligible. The study also verified that threshold limits set for wet lay-up FRP systems can be extended to pre-cured systems. The study showed that larger surface voids and cracks (cuts) can adversely impact both the strength and ductility of pre-cured FRP systems. On the other hand, the frequency (or spacing) of surface cracks (cuts) may only affect system ductility rather than strength. Finally, within the range studied, a groove size tolerance of +1/8 in. does not appear to have an adverse effect on the performance of NSM FRP systems.