14 results for Construction industry Management Data processing

in Digital Commons at Florida International University


Relevance: 100.00%

Abstract:

The outcome of this research is an Intelligent Retrieval System for Conditions of Contract documents. The objective of the research is to improve the method of retrieving data from a computer version of a construction Conditions of Contract document. SmartDoc, a prototype computer system, has been developed for this purpose. The system provides recommendations that aid the user in retrieving clauses from the Conditions of Contract document. The prototype integrates two computer technologies: hypermedia and expert systems. Hypermedia provides a dynamic way of retrieving data from the document. Expert systems technology is used to build a set of rules, based on experts' knowledge, that activate the recommendations during clause retrieval. The prototype thus helps the user retrieve related clauses that are not explicitly cross-referenced but, according to expert experience, are relevant to the topic the user is interested in.
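
The abstract names the rule-based mechanism but not its form. Below is a minimal, hypothetical sketch of how such expert rules could surface clauses that are not explicitly cross-referenced; all clause identifiers, topics, and rules are invented for illustration.

```python
# Hypothetical sketch of SmartDoc-style clause recommendation.
# Clause IDs, topics, and rules below are invented, not from the system.

# Expert rules: topics mapped to clauses that experts consider relevant
# even though the document never cross-references them explicitly.
EXPERT_RULES = {
    "delay_damages": ["extension_of_time", "force_majeure"],
    "variations": ["valuation_of_variations", "notice_requirements"],
}

# Explicit cross-references, as encoded in the hypermedia layer.
CROSS_REFS = {
    "delay_damages": ["completion_date"],
}

def recommend(topic: str) -> list[str]:
    """Suggest clauses: explicit hypermedia links first, then rule-inferred ones."""
    explicit = CROSS_REFS.get(topic, [])
    inferred = [c for c in EXPERT_RULES.get(topic, []) if c not in explicit]
    return explicit + inferred

if __name__ == "__main__":
    print(recommend("delay_damages"))
    # -> ['completion_date', 'extension_of_time', 'force_majeure']
```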

Relevance: 100.00%

Abstract:

As massive data sets become increasingly available, people face the problem of how to process and understand them effectively. Traditional sequential computing models are giving way to parallel and distributed models such as MapReduce, due both to the large size of the data sets and to their high dimensionality. This dissertation, in the same direction as other MapReduce-based research, develops effective techniques and applications that use MapReduce to solve large-scale problems. Three problems are tackled. The first deals with processing terabytes of raster data in a spatial data management system: aerial imagery files are broken into tiles to enable data-parallel computation. The second and third deal with dimension-reduction techniques for data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute the CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
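
The dissertation's own implementations are not reproduced here. As a minimal sketch of the underlying pattern, the snippet below computes the Gram matrix AᵀA of a tall-and-skinny matrix as a sum of per-row outer products, the map/reduce decomposition that MapReduce matrix-multiplication schemes of this kind commonly build on. It runs in one process for illustration; a real system would shard the rows across Hadoop workers.

```python
import numpy as np

# Sketch of a MapReduce pattern behind large-scale matrix factorization:
# A^T A computed as a sum of per-row outer products, so each mapper needs
# only one row of A and the reducer just adds partial results.

def map_row(row: np.ndarray) -> np.ndarray:
    # Mapper: emit the outer product a_i a_i^T for one row a_i.
    return np.outer(row, row)

def reduce_sum(partials) -> np.ndarray:
    # Reducer: sum all partial outer products into the Gram matrix.
    total = None
    for p in partials:
        total = p if total is None else total + p
    return total

A = np.random.rand(10_000, 8)   # tall and skinny: many rows, few columns
gram = reduce_sum(map_row(r) for r in A)
assert np.allclose(gram, A.T @ A)
```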

Relevance: 100.00%

Abstract:

In the U.S., construction accidents remain a significant economic and social problem. Despite recent improvement, the construction industry has generally lagged behind other industries in implementing safety as a total management process for achieving zero accidents and developing a high-performance safety culture. The aspect of this total approach to safety that has frustrated the construction industry the most has been "measurement": identifying and quantifying the factors that critically influence safe work behaviors. The basic problem is the difficulty of deciding what to measure and how to measure it, particularly the intangible aspects of safety; without measurement, the notion of continuous improvement is hard to follow. This research was undertaken to develop a strategic framework for the measurement and continuous improvement of total safety, in order to achieve and sustain the goal of zero accidents while improving the quality, productivity, and competitiveness of the construction industry as it moves forward. The research was based on an integral model of total safety that allowed decomposition of safety into interior and exterior characteristics using a multiattribute analysis technique. Statistical relationships between total safety dimensions and safety performance (measured by safe work behavior) were revealed through a series of latent variables (factors) that describe the total safety environment of a construction organization. A structural equation model (SEM) was estimated for the latent variables to quantify the relationships among them and between these total safety determinants and the safety performance of a construction organization. The developed SEM constitutes a strategic framework for identifying, measuring, and continuously improving safety as a total concern, for achieving and sustaining the goal of zero accidents.
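
The abstract does not reproduce the model equations. For orientation, a standard LISREL-style SEM of the kind described, with latent total-safety factors ξ predicting a latent safety-performance construct η, would be written as follows; this is the generic form, not the dissertation's exact specification:

```latex
\begin{aligned}
x    &= \Lambda_x \xi + \delta       &&\text{(indicators of the total-safety factors)}\\
y    &= \Lambda_y \eta + \varepsilon &&\text{(indicators of safety performance)}\\
\eta &= \Gamma \xi + \zeta           &&\text{(structural relations to be estimated)}
\end{aligned}
```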

Relevance: 100.00%

Abstract:

This research presents several components encompassing the scope of data partitioning and replication management in a distributed GIS database. Modern Geographic Information Systems (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographic raster data processing and to propose algorithms that improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects, as well as data partitioning in geographic databases, to achieve high data availability and Quality of Service (QoS) in distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach is proposed for mosaicking digital images of different temporal and spatial characteristics into tiles. This dynamic approach reuses digital images on demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range, and target bands, to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for efficiently acquiring GIS data from external heterogeneous databases and Web services, as well as enhancements to end-user GIS data delivery, automation, and 3D virtual-reality presentation. Vast numbers of computing, network, and storage resources on the Internet sit idle or underutilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to be used in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach developed in this dissertation resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
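
The mosaicking internals are not described in the abstract; the sketch below only illustrates the stated idea of generating and reusing tiles on demand, for just the requested region and parameters. The class, key fields, and rendering callback are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch of demand-driven tile mosaicking: tiles are rendered
# only when a request first touches them, then reused from a cache.

@dataclass(frozen=True)
class TileKey:
    x: int           # tile column in the grid
    y: int           # tile row
    resolution: int  # e.g. metres per pixel
    band: str        # e.g. "rgb", "nir"

class TileCache:
    def __init__(self, render_fn):
        self._cache = {}          # previously mosaicked tiles, reused on demand
        self._render = render_fn  # expensive mosaicking from source imagery

    def get(self, key: TileKey):
        if key not in self._cache:                # render only missing tiles,
            self._cache[key] = self._render(key)  # never the whole coverage
        return self._cache[key]

def tiles_for_region(x0, y0, x1, y1, resolution, band):
    # Enumerate only the tiles the user's requested region actually needs.
    return [TileKey(x, y, resolution, band)
            for x in range(x0, x1 + 1) for y in range(y0, y1 + 1)]

cache = TileCache(lambda key: f"mosaic for {key}")  # stand-in renderer
for key in tiles_for_region(10, 20, 11, 21, resolution=30, band="rgb"):
    cache.get(key)
```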

Relevance: 100.00%

Abstract:

In his discussion - Database As A Tool For Hospitality Management - William O'Brien, Assistant Professor, School of Hospitality Management at Florida International University, offers at the outset: "Database systems offer sweeping possibilities for better management of information in the hospitality industry. The author discusses what such systems are capable of accomplishing." The author opens with a bit of background on database system development, which also lends an impression as to the complexion of the rest of the article; it's a shade technical. "In early 1981, Ashton-Tate introduced dBase II. It was the first microcomputer database management processor to offer relational capabilities and a user-friendly query system combined with a fast, convenient report writer," O'Brien informs. "When 16-bit microcomputers such as the IBM PC series were introduced late the following year, more powerful database products followed: dBase III, Friday!, and Framework. The effect on the entire business community, and the hospitality industry in particular, has been remarkable," he further offers with his informed outlook. Professor O'Brien presents a few anecdotal situations to illustrate how much a comprehensive database system means to a hospitality operation, especially when billing is involved. Although attitudes about computer systems, as well as the systems themselves, have changed since this article was written, there is pertinent, fundamental information to be gleaned. Regarding the erosion of the personal touch when a customer is engaged with a computer system, O'Brien says, "A modern data processing system should not force an employee to treat valued customers as numbers…" He also cautions, "Any computer system that decreases the availability of the personal touch is simply unacceptable." On a system's ability to process information, O'Brien suggests that in the past businesses were so enamored with simply having an automated system that they failed to take full advantage of its capabilities; a lot of savings, in time and money, went unnoticed and/or under-appreciated. Today everyone has an integrated system, and the wise business manager is the one who takes full advantage of all his resources. O'Brien invokes the 80/20 rule: "…the last 20 percent of results costs 80 percent of the effort. But times have changed. Everyone is automating data management, so that last 20 percent that could be ignored a short time ago represents a significant competitive differential." The evolution of data systems takes center stage for much of the article; pitfalls also emerge.

Relevance: 100.00%

Abstract:

Construction organizations typically deal with large volumes of project data containing valuable information, yet these organizations do not use the data effectively for planning and decision-making. There are two reasons. First, the information systems in construction organizations are designed to support day-to-day construction operations; the data stored in these systems are often non-validated and non-integrated, and are available in a format that makes it difficult for decision makers to use them to make timely decisions. Second, the organizational structure and the IT infrastructure are often not compatible with the information systems, resulting in higher operational costs and lower productivity. These two issues were investigated in this research with the objective of developing systems that are structured for effective decision-making. A framework was developed to guide the storage and retrieval of validated and integrated data for timely decision-making, and to enable construction organizations to redesign their organizational structure and IT infrastructure to match information system capabilities. The research focused on construction owner organizations continuously involved in multiple construction projects. Action research and data warehousing techniques were used to develop the framework. One hundred sixty-three construction owner organizations were surveyed to assess their data needs, data management practices, and extent of use of information systems in planning and decision-making. For in-depth analysis, Miami-Dade Transit (MDT), which is in charge of all transportation-related construction projects in Miami-Dade County, was selected. A functional model and a prototype system were developed to test the framework. The results revealed significant improvements in data management and decision-support operations, examined through various qualitative (ease of data access, data quality, response time, productivity improvement, etc.) and quantitative (time savings and operational cost savings) measures. The research results were validated first by MDT and then by a representative group of twenty construction owner organizations involved in various types of construction projects.
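
The framework itself is not detailed in the abstract. As a loose illustration of the validate-then-integrate step it implies, the sketch below screens operational records before they reach a decision-support store; field names and validation rules are invented.

```python
import datetime

# Hypothetical validate-and-load step: day-to-day systems accept these
# records, but only validated ones are integrated for decision-making.

REQUIRED = ("project_id", "cost", "report_date")

def validate(record: dict) -> bool:
    """Reject records that are unusable for timely decisions."""
    if any(record.get(f) in (None, "") for f in REQUIRED):
        return False
    if record["cost"] < 0:
        return False
    return isinstance(record["report_date"], datetime.date)

def load(records, warehouse: list) -> list:
    """Integrate validated records; return the rest for review."""
    rejected = []
    for r in records:
        (warehouse if validate(r) else rejected).append(r)
    return rejected

warehouse: list = []
rejected = load([{"project_id": "P-1", "cost": 125_000.0,
                  "report_date": datetime.date(2024, 3, 1)},
                 {"project_id": "", "cost": -50.0, "report_date": None}],
                warehouse)
```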

Relevance: 100.00%

Abstract:

Construction organizations typically deal with large volumes of project data containing valuable information, yet these organizations do not use the data effectively for planning and decision-making. There are two reasons. First, the information systems in construction organizations are designed to support day-to-day construction operations; the data stored in these systems are often non-validated and non-integrated, and are available in a format that makes it difficult for decision makers to use them to make timely decisions. Second, the organizational structure and the IT infrastructure are often not compatible with the information systems, resulting in higher operational costs and lower productivity. These two issues were investigated in this research with the objective of developing systems that are structured for effective decision-making. A framework was developed to guide the storage and retrieval of validated and integrated data for timely decision-making, and to enable construction organizations to redesign their organizational structure and IT infrastructure to match information system capabilities. The research focused on construction owner organizations continuously involved in multiple construction projects. Action research and data warehousing techniques were used to develop the framework. One hundred sixty-three construction owner organizations were surveyed to assess their data needs, data management practices, and extent of use of information systems in planning and decision-making. For in-depth analysis, Miami-Dade Transit (MDT), which is in charge of all transportation-related construction projects in Miami-Dade County, was selected. A functional model and a prototype system were developed to test the framework. The results revealed significant improvements in data management and decision-support operations, examined through various qualitative (ease of data access, data quality, response time, productivity improvement, etc.) and quantitative (time savings and operational cost savings) measures. The research results were validated first by MDT and then by a representative group of twenty construction owner organizations involved in various types of construction projects.

Relevance: 100.00%

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning," inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of the approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic-range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data and consequently undermines the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and carry inherent calibration and noise effects, or to software techniques that filter the binning effect but fail to preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is compelling enough that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that allows researchers in the field of flow cytometry to improve their interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, a proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect at the experimental assessment level, providing a data platform that preserves statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This method uses the properties of transformations of statistical distributions to accumulate the output histogram in a non-integer, multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation that lends itself to real-time execution using lookup tables, a procedure also introduced in this dissertation.
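
The patented method itself is not disclosed in the abstract. To illustrate only the final point, that a precomputed lookup table reduces per-event log accumulation to a single array access, here is a conventional integer-channel log-binning LUT; the dissertation's non-integer, multi-channel accumulation is deliberately not reproduced, and the ADC resolution and channel count are assumed values.

```python
import numpy as np

# Conventional lookup-table log binning (illustration only; NOT the
# dissertation's non-integer multi-channel method). A LUT precomputed over
# all possible ADC values makes per-event binning one lookup + one add.

ADC_LEVELS = 2 ** 14   # assumed linear ADC resolution
LOG_BINS = 1024        # assumed output histogram channels

values = np.arange(1, ADC_LEVELS)
lut = np.floor(LOG_BINS * np.log10(values) / np.log10(ADC_LEVELS - 1)).astype(int)
lut = np.clip(lut, 0, LOG_BINS - 1)
lut = np.concatenate(([0], lut))   # send ADC value 0 to channel 0

def accumulate(events: np.ndarray) -> np.ndarray:
    """Accumulate raw ADC events into a log-scaled histogram in real time."""
    hist = np.zeros(LOG_BINS, dtype=np.int64)
    np.add.at(hist, lut[events], 1)   # one lookup and one increment per event
    return hist

hist = accumulate(np.random.randint(0, ADC_LEVELS, size=100_000))
```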

Relevance: 100.00%

Abstract:

We present 8 yr of long-term water quality, climatological, and water management data for 17 locations in Everglades National Park, Florida. Total phosphorus (P) concentration data from freshwater sites (typically <0.25 μmol L⁻¹, or 8 μg L⁻¹) indicate the oligotrophic, P-limited nature of this large freshwater–estuarine landscape. Total P concentrations at estuarine sites near the Gulf of Mexico (average ≈0.5 μmol L⁻¹) demonstrate the marine source for this limiting nutrient. This "upside down" phenomenon, with the limiting nutrient supplied by the ocean and not the land, is a defining characteristic of the Everglades landscape. We present a conceptual model of how the seasonality of precipitation and the management of canal water inputs control the marine P supply, and we hypothesize that seasonal variability in water residence time controls water quality through internal biogeochemical processing. Low freshwater inflows during the dry season increase estuarine residence times, enabling local processes to control nutrient availability and water quality. El Niño–Southern Oscillation (ENSO) events tend to mute the seasonality of rainfall without altering total annual precipitation inputs. The Niño3 ENSO index (which indicates an El Niño event when positive and a La Niña event when negative) was positively correlated with both annual rainfall and the ratio of dry season to wet season precipitation. This ENSO-driven disruption in seasonal rainfall patterns affected salinity patterns and tended to reduce marine inputs of P to Everglades estuaries. ENSO events also decreased dry season residence times, reducing the importance of estuarine nutrient processing. The combination of variable water management activities and interannual differences in precipitation patterns has a strong influence on nutrient and salinity patterns in Everglades estuaries.
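
For reference, the two freshwater phosphorus figures quoted above are the same concentration in different units (molar mass of P ≈ 30.97 g mol⁻¹):

```latex
0.25\ \mu\mathrm{mol\,L^{-1}} \times 30.97\ \mathrm{g\,mol^{-1}}
  \approx 7.7\ \mu\mathrm{g\,L^{-1}} \approx 8\ \mu\mathrm{g\,L^{-1}}
```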

Relevance: 100.00%

Abstract:

In the article - Past, Present, and Future: The Food Service Industry and Its Changes - Brother Herman E. Zaccarelli, International Director, Restaurant, Hotel and Institutional Management Institute at Purdue University, initially states: "Educators play an important role in the evolution of the food service industry. The author discusses that evolution and suggests how educators can be change agents along with management in that evolutionary progression." The author goes on to speak philosophically, as well as generically, about the food service industry and why it offers fascinating and rewarding careers. Additionally, he writes about the influence educators have on students in this regard. "Educators can speak about how the food service industry has benefited them both personally and professionally," says Brother Zaccarelli. "We get excited about alerting students to the many opportunities and, in fact, serve as 'salespersons' for the industry to whoever (school administrators, legislators, and peers in the educational institution) will listen." Brother Zaccarelli also speaks to growth and changes in food service and, even more importantly, to the people and faces behind everything that food service, and hospitality in general, comprise. The author will have you know that people are what drive an educator. "What makes the food service industry so great? At the heart of this question's answer is people: the people whom it serves in institutional and commercial operations of all types; the people who work within it; the people who provide the goods, services, and equipment to it; the people who study it," says Brother Zaccarelli. "All of these groups have, of course, a vested personal and/or professional interest in seeing our industry improve." Another concept the author would like you to absorb, one even truer today than when the article was written, is the prevalence of convergence and divergence within food service. For food service and beyond, it is the common denominators and the differences that make the hospitality-food service industry so dynamic and vibrant. These are the winds of change presented to an educator who wants to have a positive impact on students. The author warns that the many elements involved in the food service industry conspire to erode quality of service in an industry that is persistently expanding and whose cornerstone principles are underpinned by service itself. "The three concerns addressed - quality, employees, and marketing - are intimately related," Brother Zaccarelli says in stripping the industry down to its bare essentials. He defines and addresses the issues related to each, with an eye toward how education can reconcile them.

Relevance: 100.00%

Abstract:

The authors apply economic theory to an analysis of industry pricing. Data from a cross-section of San Francisco hotels is used to estimate the implicit prices of common hotel amenities, and a procedure for using these prices to estimate consumer demands for the attributes is outlined. The authors then suggest implications for hotel decision makers. While the results presented here should not be generalized to other markets, the methodology is easily adapted to other geographic areas.
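
The article's exact specification is not given in this summary; implicit-price studies of this kind typically estimate a hedonic regression of the (log) room rate on amenity indicators, reading each coefficient as the amenity's implicit price, for example:

```latex
\ln P_i \;=\; \beta_0 + \sum_{k} \beta_k z_{ik} + \varepsilon_i,
\qquad
\frac{\partial P_i}{\partial z_{ik}} \;\approx\; \beta_k P_i ,
```

where $P_i$ is the room rate of hotel $i$ and $z_{ik}$ indicates whether it offers amenity $k$.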

Relevance: 100.00%

Abstract:

Modern civilization has developed principally through man's harnessing of forces. For centuries man had to rely on wind, water, and animal force as the principal sources of power. The advent of the industrial revolution, electrification, and the development of new technologies led to the use of wood, coal, gas, petroleum, and uranium to fuel new industries, produce goods and means of transportation, and generate the electrical energy that has become such an integral part of our lives. The geometric growth in energy consumption, coupled with the world's unrestricted growth in population, has caused a disproportionate use of these limited natural resources. The resulting energy predicament could have serious consequences within the next half century unless we commit ourselves to a philosophy of effective energy conservation and management. National legislation, along with the initiative of private industry and growing interest in the private sector, has played a major role in stimulating the adoption of energy-conserving laws, technologies, measures, and practices. Energy conservation is a matter of serious concern in the United States, where ninety-five percent of the commercial and industrial facilities that will be standing in the year 2000 - many in need of retrofit - are already in place. To conserve energy, it is crucial to first understand how a facility consumes energy, how its users' needs are met, and how all internal and external elements interrelate. To this end, the major thrust of this report is to emphasize the need to develop an energy conservation plan that incorporates energy auditing and surveying techniques. Numerous energy-saving measures and practices are presented, ranging from simple no-cost opportunities to capital-intensive investments.

Relevance: 100.00%

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning," inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of the approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic-range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data and consequently undermines the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and carry inherent calibration and noise effects, or to software techniques that filter the binning effect but fail to preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is compelling enough that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that allows researchers in the field of flow cytometry to improve their interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, a proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect at the experimental assessment level, providing a data platform that preserves statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This method uses the properties of transformations of statistical distributions to accumulate the output histogram in a non-integer, multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation that lends itself to real-time execution using lookup tables, a procedure also introduced in this dissertation.

Relevance: 100.00%

Abstract:

The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources, such as CPU, memory, and I/O bandwidth, in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying service-level agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications make administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that substantially reduce data center management complexity. We specifically addressed two crucial data center operations: first, precisely estimating the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment; second, systematically and efficiently allocating physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold: cloud users can size their VMs appropriately and pay only for the resources they need, and service providers can offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients pay exactly for the performance they actually experience, while administrators can maximize total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified the resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications; we also suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue of a data center.
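
The thesis's features and data are not reproduced here. The sketch below, using synthetic data and invented resource features, shows the general shape of an SVM-based performance model of the kind named above: a Support Vector Machine regression from resource allocations to application performance.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic illustration of modeling application performance from resource
# allocations with an SVM. Features, data, and targets are placeholders.

rng = np.random.default_rng(0)
# Columns: CPU cap (cores), memory (GB), I/O bandwidth share (0-1).
X = rng.uniform([0.5, 1.0, 0.1], [8.0, 32.0, 1.0], size=(500, 3))
# Throughput that saturates in whichever resource is the bottleneck.
y = np.minimum.reduce([40 * X[:, 0], 12 * X[:, 1], 300 * X[:, 2]])
y += rng.normal(0, 5, size=len(y))   # measurement noise

# Scaling matters for SVR; the pipeline keeps train/predict consistent.
model = make_pipeline(StandardScaler(), SVR(C=100.0, epsilon=1.0))
model.fit(X, y)

# A fitted model answers VM-sizing questions: predicted performance for a
# candidate configuration (4 cores, 8 GB memory, 50% I/O share).
print(model.predict([[4.0, 8.0, 0.5]]))
```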