Abstract:
Experience plays an important role in building management. “How often will this asset need repair?” or “How much time will this repair take?” are the kinds of questions that project and facility managers face daily in planning activities. Failure or success in developing good schedules, budgets and other project management tasks depends on the project manager's ability to obtain reliable information with which to answer these questions. Young practitioners tend to rely on information based on regional averages and provided by publishing companies, whereas experienced project managers tend to rely heavily on personal experience. Another aspect of building management is that many practitioners are seeking to improve available scheduling algorithms, estimating spreadsheets and other project management tools. Such “micro-scale” research is important in providing the tools required for the project manager's tasks. However, even with such tools, low-quality input information will produce inaccurate schedules and budgets as output. Thus, it is also important to take a broader, “macro-scale” approach to research. Recent trends show that the Architectural, Engineering and Construction (AEC) industry is experiencing explosive growth in its capability to generate and collect data. A great deal of valuable knowledge can be obtained from the appropriate use of these data, and the need has therefore arisen to analyse this growing volume of available data. Data mining can be applied as a powerful tool to extract relevant and useful information from this sea of data. Knowledge Discovery in Databases (KDD) and Data Mining (DM) allow valid, useful and previously unknown patterns to be identified so that large amounts of project data may be analysed.
These technologies combine techniques from machine learning, artificial intelligence, pattern recognition, statistics, databases and visualisation to automatically extract concepts, interrelationships and patterns of interest from large databases. The project involves the development of a prototype tool to support facility managers, building owners and designers. This final report presents the AIMM™ prototype system and documents which data mining techniques can be applied and how, the results of their application and the benefits gained from the system. The AIMM™ system is capable of searching for useful patterns of knowledge and correlations within existing building maintenance data to support decision making about future maintenance operations. The application of the AIMM™ prototype system to building models and their maintenance data (supplied by industry partners) utilises various data mining algorithms, and the maintenance data are analysed using interactive visual tools. Applying the AIMM™ prototype system to improve maintenance management and the building life cycle involves: (i) data preparation and cleaning, (ii) integrating meaningful domain attributes, (iii) performing extensive data mining experiments using visual analysis (stacked histograms), classification and clustering techniques, and association rule mining algorithms such as Apriori, and (iv) filtering and refining the data mining results, including their potential implications for improving maintenance management. Maintenance data for a variety of asset types were selected for demonstration, with the aim of discovering meaningful patterns to assist facility managers in strategic planning and of providing a knowledge base to help shape future requirements and design briefing. Utilising the prototype system developed here, positive and interesting results regarding patterns and structures in the data have been obtained.
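As a rough illustration of the association-rule step the abstract mentions, the following is a minimal Apriori sketch over toy maintenance "transactions". The asset names, failure modes and the support threshold are invented for illustration; this is not the AIMM system's implementation.

```python
from itertools import combinations

# Toy maintenance records: each "transaction" lists attributes of one work
# order. Asset types, failure modes and seasons are illustrative only.
records = [
    {"pump", "seal-failure", "summer"},
    {"pump", "seal-failure", "winter"},
    {"chiller", "refrigerant-leak", "summer"},
    {"pump", "seal-failure", "summer"},
    {"chiller", "refrigerant-leak", "summer"},
]

def apriori(transactions, min_support):
    """Return frequent itemsets (frozensets) with support >= min_support."""
    n = len(transactions)
    # Start from all 1-itemsets observed in the data.
    current = {frozenset([item]) for t in transactions for item in t}
    frequent = {}
    while current:
        # Count the support of each candidate itemset.
        counts = {c: sum(1 for t in transactions if c <= t) / n for c in current}
        survivors = {c: s for c, s in counts.items() if s >= min_support}
        frequent.update(survivors)
        # Join surviving k-itemsets to form (k+1)-candidates.
        keys = list(survivors)
        current = {a | b for a, b in combinations(keys, 2)
                   if len(a | b) == len(a) + 1}
    return frequent

freq = apriori(records, min_support=0.4)
# Frequent pairs such as {pump, seal-failure} suggest a recurring
# asset/failure association a facility manager might act on.
```

In a real maintenance dataset the transactions would come from cleaned work-order records (step (i) of the abstract), and the discovered itemsets would feed the rule-filtering step (iv).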
Abstract:
The explosive growth of the World Wide Web and the emergence of ecommerce are the two major factors that have led to the development of recommender systems (Resnick and Varian, 1997). The main task of recommender systems is to learn from users and recommend items (e.g. information, products or books) that match the users' personal preferences. Recommender systems have been an active research area for more than a decade, and many different techniques and systems with distinct strengths have been developed to generate better quality recommendations. One of the main factors affecting recommendation quality is the amount of information resources available to the recommender. The main feature of recommender systems is their ability to make personalised recommendations for different individuals. However, many ecommerce sites find it difficult to obtain sufficient knowledge about their users, and hence the recommendations they provide are often poor and not personalised. This information insufficiency problem is commonly referred to as the cold-start problem. Most existing research on recommender systems focuses on developing techniques to better utilise the available information resources to achieve better recommendation quality. However, while the amount of available data and information remains insufficient, these techniques can provide only limited improvements in overall recommendation quality. In this thesis, a novel and intuitive approach towards improving recommendation quality and alleviating the cold-start problem is attempted: enriching the information resources. It can easily be observed that when there is a sufficient information and knowledge base to support recommendation making, even the simplest recommender systems can outperform sophisticated ones with limited information resources.
Two strategies are suggested in this thesis to achieve the proposed information enrichment for recommenders:
• The first strategy enriches the information resources by considering other information or data facets. Specifically, a taxonomy-based recommender, the Hybrid Taxonomy Recommender (HTR), is presented. HTR exploits the relationship between users' taxonomic preferences and item preferences, derived from the combination of widely available product taxonomy information and existing user rating data, and then utilises this taxonomic-preference-to-item-preference relation to generate high quality recommendations.
• The second strategy enriches the information resources simply by obtaining them from other parties. A distributed recommender framework, the Ecommerce-oriented Distributed Recommender System (EDRS), is proposed. EDRS allows multiple recommenders from different parties (i.e. organisations or ecommerce sites) to share recommendations and information resources with each other in order to improve their recommendation quality.
Based on the results of the experiments conducted in this thesis, the proposed systems and techniques achieve substantial improvements both in making quality recommendations and in alleviating the cold-start problem.
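The second strategy above can be sketched with a toy example: a cold-start user has too few ratings to overlap with anyone on one site, but pooling ratings from a partner site creates the overlap a similarity-based recommender needs. This is only an illustration of the information-enrichment idea with plain cosine similarity, not the thesis's HTR or EDRS algorithms; all names and ratings are invented.

```python
from math import sqrt

# Toy rating matrices: user -> {item: rating}. Illustrative data only.
site_a = {
    "alice": {"book1": 5, "book2": 4},
    "bob":   {"book1": 4, "book3": 5},
    "carol": {"book2": 5},              # cold-start user: one known rating
}
# Ratings held by a hypothetical partner site (EDRS-style sharing).
site_b = {
    "alice": {"book4": 5},
    "carol": {"book3": 4, "book4": 5},
}

def merge(*sources):
    """Pool the rating matrices of several parties."""
    pooled = {}
    for src in sources:
        for user, ratings in src.items():
            pooled.setdefault(user, {}).update(ratings)
    return pooled

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = sqrt(sum(r * r for r in u.values())) * sqrt(sum(r * r for r in v.values()))
    return num / den

# With only site A's data, carol shares no rated items with bob.
sim_sparse = cosine(site_a["carol"], site_a["bob"])
# After pooling both sites' data, an overlap (book3) appears.
pooled = merge(site_a, site_b)
sim_pooled = cosine(pooled["carol"], pooled["bob"])
```

Even this crude measure goes from no signal to a usable neighbourhood once the data are enriched, which is the intuition behind both strategies.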
Abstract:
Due to the explosive growth of the Web, the domain of Web personalisation has gained great momentum in both research and commercial areas, and recommender systems are among the most popular Web personalisation systems. In recommender systems, choosing user information suitable for profiling users is crucial. In Web 2.0, user tagging systems are one facility that helps users organise Web resources of interest. Exploring user tagging behaviour is a promising way of understanding users' information needs, since tags are supplied directly by users. However, the free and relatively uncontrolled vocabulary leaves user-defined tags unstandardised and semantically ambiguous. The relationships among tags also need to be explored, since these rich relationships can provide valuable information for better understanding users. In this paper, we propose a novel approach for learning a tag ontology, based on the widely used lexical database WordNet, that captures the semantics and the structural relationships of tags. We present personalisation strategies that disambiguate the semantics of tags by combining the opinions of WordNet lexicographers with users' tagging behaviour. To personalise further, users are clustered to generate a more accurate ontology for a particular group of users. To evaluate the usefulness of the tag ontology, we use it in a pilot tag recommendation experiment, exploiting its semantic information to improve recommendation performance. Initial results show that the personalised information improves the accuracy of tag recommendation.
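The disambiguation idea can be illustrated with a minimal sketch: pick the sense of a tag whose gloss overlaps most with the user's other tags. The sense inventory below is a tiny hand-coded stand-in for WordNet synsets, and the overlap rule is a simplified Lesk-style heuristic; neither is the paper's actual algorithm.

```python
# Hand-coded sense inventory standing in for WordNet synsets
# (illustrative only; the paper uses WordNet itself).
senses = {
    "apple": [
        {"id": "apple.fruit",   "gloss": {"fruit", "tree", "food", "red"}},
        {"id": "apple.company", "gloss": {"computer", "mac", "iphone", "technology"}},
    ],
}

def disambiguate(tag, cooccurring_tags):
    """Choose the sense whose gloss overlaps most with the tags this user
    applies alongside the ambiguous tag (a crude Lesk-style rule)."""
    best = max(senses[tag], key=lambda s: len(s["gloss"] & cooccurring_tags))
    return best["id"]

# A user who tags the same resources with {"iphone", "mac"} most likely
# means the company, not the fruit.
sense = disambiguate("apple", {"iphone", "mac"})
```

Clustering users first, as the paper proposes, would let the co-occurring tag sets reflect a group's shared vocabulary rather than a single user's sparse history.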
Abstract:
With the explosive growth of resources available through the Internet, information mismatch and overload have become a severe concern for users. Web users are commonly overwhelmed by huge volumes of information and face the challenge of finding the most relevant and reliable information in a timely manner. Personalised information gathering and recommender systems represent state-of-the-art tools for efficient selection of the most relevant and reliable information resources, and interest in such systems has increased dramatically over the last few years. However, Web personalisation has not yet been fully exploited, and difficulties arise in selecting resources through recommender systems from both technological and social perspectives. Aiming to promote high quality research that overcomes these challenges, this paper provides a comprehensive survey of recent work and achievements in personalised Web information gathering and recommender systems. The survey covers concept-based techniques exploited in personalised information gathering and recommender systems.
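A concept-based technique of the kind the survey covers can be sketched as matching a weighted user concept profile against documents' concept vectors. The concepts, weights and document names below are invented for illustration; real systems derive them from ontologies and user feedback.

```python
from math import sqrt

# Illustrative concept weights only; real systems learn these from
# ontologies, browsing history and user feedback.
user_profile = {"machine-learning": 0.9, "recommender-systems": 0.7, "cooking": 0.1}

documents = {
    "doc-cf-survey": {"recommender-systems": 0.8, "machine-learning": 0.5},
    "doc-pasta":     {"cooking": 0.9},
}

def cosine(a, b):
    """Cosine similarity between two sparse concept vectors."""
    common = set(a) & set(b)
    num = sum(a[k] * b[k] for k in common)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

# Rank documents by similarity to the user's concept profile.
ranked = sorted(documents, key=lambda d: cosine(user_profile, documents[d]),
                reverse=True)
```

The point of concept-level (rather than keyword-level) matching is that a document can rank highly for a user even when it shares no literal terms with past queries, as long as it maps to the same concepts.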
Abstract:
The explosive growth in the development of Traditional Chinese Medicine (TCM) has led to a continued increase in clinical and research data. The lack of standardised terminology and flaws in the data quality planning and management of TCM informatics are hindering clinical decision-making, drug discovery and education. This paper argues that introducing data warehousing technologies to enhance the effectiveness and durability of TCM is paramount. To showcase the role of data warehousing in improving TCM, the paper presents and explains in detail a practical data warehousing model, based on structured electronic records, for TCM clinical research and medical knowledge discovery.
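To make the data-warehousing idea concrete, here is a minimal star-schema sketch in SQLite: dimension tables for patients, herbs and syndromes around a fact table of prescription events. The table and column names are invented for illustration and are not taken from the paper's model.

```python
import sqlite3

# Minimal star schema for TCM clinical records (illustrative names only).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_patient  (patient_id INTEGER PRIMARY KEY, sex TEXT, age INTEGER);
CREATE TABLE dim_herb     (herb_id INTEGER PRIMARY KEY, name TEXT, nature TEXT);
CREATE TABLE dim_syndrome (syndrome_id INTEGER PRIMARY KEY, name TEXT);
-- Fact table: one row per prescription event, keyed to the dimensions.
CREATE TABLE fact_prescription (
    patient_id  INTEGER REFERENCES dim_patient,
    herb_id     INTEGER REFERENCES dim_herb,
    syndrome_id INTEGER REFERENCES dim_syndrome,
    dose_grams  REAL
);
""")
con.execute("INSERT INTO dim_herb VALUES (1, 'ginseng', 'warm')")
con.execute("INSERT INTO dim_syndrome VALUES (1, 'qi deficiency')")
con.execute("INSERT INTO fact_prescription VALUES (NULL, 1, 1, 9.0)")

# Typical warehouse query: total dose of each herb per syndrome.
row = con.execute("""
    SELECT h.name, s.name, SUM(f.dose_grams)
    FROM fact_prescription f
    JOIN dim_herb h ON h.herb_id = f.herb_id
    JOIN dim_syndrome s ON s.syndrome_id = f.syndrome_id
    GROUP BY h.name, s.name
""").fetchone()
```

Standardised dimension tables are where the terminology problems the paper describes would be resolved once, so that every fact row joins to a controlled vocabulary.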
Abstract:
The fossil fuel divestment movement has undergone explosive growth over the last few years - expanding from encouraging educational institutions to adopt ethical investment policies to focusing upon cities, pension funds and philanthropic charities. The fossil fuel divestment movement has attained global ambitions - challenging sovereign wealth funds and national governments to engage in fossil fuel divestment, and pushing for fossil fuel divestment at international climate talks - such as the Paris Climate Summit in 2015. By exploring and analysing a key campaign to 'Divest Norway', this chapter considers the efforts to globalise and internationalise the fossil fuel divestment campaign. Part 1 explores the origins of the fossil fuel divestment movement, and the application of such strategies in a variety of contexts. Part 2 looks at the campaign to divest Norway's sovereign wealth fund of fossil fuel investments. There has been much discussion as to whether the bold decision of Norway to engage in coal divestment will encourage and inspire other sovereign wealth funds to engage in fossil fuel divestment. The conclusion considers the efforts to introduce fossil fuel divestment as a policy initiative for nation states as a policy option in international climate law.
Abstract:
The development of new synthetic strategies to obtain mono-disperse metal nanoparticles on large scales is an attractive prospect in the context of sustainability. Recently, amine-boranes, the classical Lewis acid-base adducts, have been employed as reducing agents for the synthesis of metal nanoparticles. They offer several advantages over traditional reducing agents like the borohydrides: for example, much better control of the rate of reduction and, hence, of the particle size distribution of the metal nanoparticles; diversity in their reducing abilities through variation of the substituents on the nitrogen atom; and solubility in various protic and aprotic solvents. Amine-boranes have been used successfully as reducing agents not only in solution but also under solventless conditions, in which, along with reducing the metal precursor, they undergo in situ transformation to afford the stabilizing agent for the generated metal nanoparticles, thereby achieving atom economy as well. The use of amine-boranes for the synthesis of metal nanoparticles has experienced explosive growth in a very short period of time. In this Minireview, recent progress on the use of amine-boranes for the synthesis of metal nanoparticles, with a focus on the development of pathways for sustainability, is discussed.
Abstract:
This paper is a review prepared for the second Marseille Colloquium on the mechanics of turbulence, held in 2011, 50 years after the first. The review covers recent developments in our understanding of the large-scale dynamics of cumulus cloud flows and of the atmospheric boundary layer in the low-wind convective regime that is often encountered in the tropics. It has recently been shown that a variety of cumulus cloud forms and life cycles can be experimentally realized in the laboratory, with the transient diabatic plume taken as the flow model for a cumulus cloud. The plume is subjected to diabatic heating scaled to be dynamically similar to heat release from phase changes in clouds. The experiments are complemented by exact numerical solutions of the Navier-Stokes-Boussinesq equations for plumes with scaled off-source heating. The results show that the Taylor entrainment coefficient first increases with heating, reaches a positive maximum and then drops rapidly to zero or even negative values. This reduction in entrainment is a consequence of structural changes in the flow, smoothing out the convoluted boundaries in the non-diabatic plume, including the tongues engulfing the ambient flow. This is accompanied by a greater degree of mixedness in the core flow because of lower dilution by the ambient fluid. The cloud forms generated depend strongly on the history of the diabatic heating profile in the vertical direction. The striking effects of heating on the flow are attributable to the operation of the baroclinic torque due to the temperature field. The mean baroclinic torque is shown to peak around a quasi-cylindrical sheet situated midway between the axis of the flow and the edges. This torque is shear-enhancing and folds down the engulfment tongues. 
The increase in mixedness can be traced to an explosive growth in the enstrophy, triggered by a strong fluctuating baroclinic torque that acts as a source, especially at the higher wave numbers. In convective boundary layers, field measurements show that, under conditions prevailing in the tropics, the eddy fluxes of momentum and energy do not follow Monin-Obukhov similarity. Instead, the eddy momentum flux is found to be linear in the wind speed at low winds, and the eddy heat flux is, to a first approximation, governed by free convection laws, with wind acting as a small perturbation on a regime of free convection. A new boundary layer code, based on heat flux scaling rather than wall-stress scaling, shows promising improvements in the predictive skill of a general circulation model.
Abstract:
Salvinia (Salvinia minima Willd.) is a water fern found in Florida waters, usually associated with Lemna and other small free-floating species. Due to its buoyancy and mat-forming abilities, it is spread by moving waters. In 1994, salvinia was reported to be present in 247 water bodies in the state (out of 451 surveyed public waters, Schardt 1997). It is a small, rapidly growing species that can become a nuisance due to its explosive growth rates and its ability to shade underwater life (Oliver 1993). Any efforts toward management of salvinia populations must consider that, in reasonable amounts, its presence is desirable, since it plays an important role in the overall ecosystem balance. New management alternatives need to be explored besides conventional herbicide treatments; for example, it has been shown that the growth of S. molesta can be inhibited by extracts of the tropical weed parthenium (Parthenium hysterophorus) and its purified toxin parthenin (Pande 1994, 1996). We believe that cattail, Typha spp., may be a candidate for control of S. minima infestations. Cattail is an aggressive aquatic plant with the ability to expand over areas that were not previously occupied by other species (Gallardo et al. 1998a and references cited there). In South Florida, T. domingensis is a natural component of the Everglades ecosystem, but in many cases it has become the dominant marsh species, outcompeting other native plants. In Florida public waters, this cattail species is the most dominant emergent species of aquatic plants (Schardt 1997). Several factors enable it to accomplish opportunistic expansion, including size, growth habits, adaptability to changes in the surroundings, and the release of compounds that can prevent the growth and development of other species. We have been concerned in the past with the inhibitory effects of T. domingensis extracts, and the phenolic compounds mentioned before, on the growth and propagation of S. minima (Gallardo et al. 1998b). This investigation deals with the impact of cattail materials on the rates of oxygen production of salvinia, as determined through a series of Warburg experiments (Martin et al. 1987, Prindle and Martin 1996).
Abstract:
This thesis describes the expansion and improvement of the iterative in situ click chemistry OBOC peptide library screening technology. Previous work provided a proof-of-concept demonstration that this technique was advantageous for the production of protein-catalyzed capture (PCC) agents that could be used as drop-in replacements for antibodies in a variety of applications. Chapter 2 describes the technology development that was undertaken to optimize this screening process and make it readily available for a wide variety of targets. This optimization is what has allowed for the explosive growth of the PCC agent project over the past few years.
These technology improvements were applied to the discovery of PCC agents specific for single amino acid point mutations in proteins, which have many applications in cancer detection and treatment. Chapter 3 describes the use of a general all-chemical epitope-targeting strategy that can focus PCC agent development directly on a site of interest on a protein surface. This technique utilises a chemically synthesized fragment of the protein, called an epitope, substituted with a click handle, in combination with the OBOC in situ click chemistry libraries in order to focus ligand development at a site of interest. Specifically, Chapter 3 discusses the use of this technique in developing a PCC agent specific for the E17K mutation of Akt1. Chapter 4 details the expansion of this ligand into a mutation-specific inhibitor, with applications in therapeutics.
Abstract:
The California fishery for red sea urchins, Strongylocentrotus franciscanus, has undergone explosive growth in recent years and is approaching full exploitation. Thus, there is considerable interest in enhancing stocks to maintain a high rate of landings. Fishable stocks of red sea urchins in different areas appear to be limited at three stages in their life history: By the availability of larvae, by the survival of newly settled to mid-sized animals, and by the food available to support growth and reproduction of larger animals. Here I review other efforts, notably the extensive Japanese work, to enhance fishable stocks of benthic marine invertebrates, and consider the potential options for red sea urchins at different points of limitation. These include collecting or culturing seed for outplanting, physical habitat improvement measures, improving the food supply, and conservation measures to protect existing stocks until alternate methods are proven and in place. The options are compared in terms of biological feasibility, capital and labor requirements, and potential implications for change in the structure of the fishing industry.
Abstract:
R. Jensen and Q. Shen, 'Fuzzy-Rough Attribute Reduction with Application to Web Categorization,' Fuzzy Sets and Systems, vol. 141, no. 3, pp. 469-485, 2004.
Abstract:
R. Jensen and Q. Shen, 'Webpage Classification with ACO-enhanced Fuzzy-Rough Feature Selection,' Proceedings of the Fifth International Conference on Rough Sets and Current Trends in Computing (RSCTC 2006), LNAI 4259, pp. 147-156, 2006.
Abstract:
Over the last decade, wireless communication networks have grown exponentially, both in service penetration rates and in the deployment of new infrastructure across the globe. It is now well established that this trend will not only continue but strengthen, driven by the expected convergence between mobile wireless networks and the broadband services of the fixed Internet, evolving towards an integrated architecture based on IP services and applications. For this reason, mobile wireless communications will play a fundamental role in the development of the information society in the medium and long term. The design and implementation of current-generation (2G and 3G) cellular networks followed a strategy of stratifying the protocol architecture into a modular structure of self-contained layers, each responsible for implementing a set of functions. In this model, communication takes place only between adjacent layers, through pre-established communication primitives. This architecture makes it easier to implement and to introduce new functionality into the network. However, because the lower layers of the protocol stack do not use information made available by the upper layers, and vice versa, system performance is degraded. This issue is particularly important when multiple-antenna (MIMO) systems are deployed. Multiple-antenna systems introduce an additional degree of freedom in radio resource allocation: the spatial domain.
Unlike resource allocation in the time and frequency domains, radio resources mapped onto the spatial domain cannot be assumed to be fully orthogonal, owing to the interference that arises when several terminals transmit on the same channel and/or time slots but on different spatial beams. Making information about the state of the radio resources available to the upper layers of the protocol stack is therefore of fundamental importance for meeting the required quality-of-service criteria. Efficient radio resource management requires low-complexity packet scheduling algorithms that set users' priority levels for access to those resources based on information provided by both the lower and the upper layers of the protocol stack. This new communication paradigm, known as cross-layer design, maximises the data-carrying capacity of the mobile radio channel while satisfying the quality-of-service requirements derived from the application layer. The IEEE 802.16e standard, known as Mobile WiMAX, was drawn up to meet the specifications associated with fourth-generation cellular systems. Its scalable architecture, low deployment cost and high data rates yield efficient data multiplexing and low packet-transmission delays, which are fundamental attributes for delivering broadband services. Likewise, the packet-switched communication inherent in its medium access control layer is fully compatible with the quality-of-service demands of such applications. Mobile WiMAX therefore appears to satisfy the demanding requirements of fourth-generation mobile networks.
This thesis investigates, designs and implements packet scheduling algorithms for the efficient management of radio resources in the time, frequency and spatial domains of cellular networks, taking as a case study networks based on the IEEE 802.16e standard. The proposed algorithms combine physical-layer metrics with the quality-of-service requirements of the upper layers, following the cross-layer design paradigm. Their performance is analysed through simulations carried out on a system-level simulator that implements the physical and medium access control layers of the IEEE 802.16e standard.
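The cross-layer idea described above can be sketched in a few lines: each user's scheduling priority combines a physical-layer metric (instantaneous achievable rate relative to the long-term average) with upper-layer QoS state (head-of-line delay against a delay budget). This is a generic M-LWDF-style rule with invented numbers, not one of the thesis's algorithms.

```python
# Toy user set: (name, instantaneous rate [Mbit/s], long-term average
# rate [Mbit/s], head-of-line delay [ms], delay budget [ms]).
# Values are illustrative only.
users = [
    ("voip-user", 2.0, 1.8, 45.0, 50.0),   # near its delay budget
    ("web-user", 10.0, 6.0,  5.0, 500.0),  # good channel, lax budget
]

def priority(inst_rate, avg_rate, delay, budget):
    """Proportional-fair term scaled by delay urgency (M-LWDF-style)."""
    fairness = inst_rate / max(avg_rate, 1e-9)  # physical-layer metric
    urgency = delay / budget                    # upper-layer QoS pressure
    return fairness * (1.0 + urgency)

# The scheduler serves the user with the highest combined priority.
scheduled = max(users, key=lambda u: priority(*u[1:]))
```

Even though the web user has the better channel, the VoIP user's delay pressure wins the slot, which is exactly the behaviour a pure physical-layer (channel-only) scheduler would miss.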