891 results for contrast analysis
Abstract:
Search engines have forever changed the way people access and discover knowledge, allowing information about almost any subject to be quickly and easily retrieved within seconds. As ever more material becomes available electronically, the influence of search engines on our lives will continue to grow. This presents the problem of how to find out what information is contained in each search engine, what bias a search engine may have, and how to select the best search engine for a particular information need. This research introduces a new method, search engine content analysis, to solve this problem. Search engine content analysis is a new development of the traditional information retrieval field of collection selection, which deals with general information repositories. Current research in collection selection relies on full access to the collection or on estimates of collection size, and collection descriptions are often represented as term occurrence statistics. An automatic ontology learning method is developed for search engine content analysis, which trains an ontology with world knowledge of hundreds of different subjects in a multilevel taxonomy. This ontology is then mined for important classification rules, and these rules are used to perform an extensive analysis of the content of the largest general-purpose Internet search engines in use today. Instead of representing collections as a set of terms, as commonly occurs in collection selection, they are represented as a set of subjects, leading to a more robust representation of information and reducing the effects of synonymy. The ontology-based method was compared with ReDDE (Relevant Document Distribution Estimation method for resource selection), the current state-of-the-art collection selection method, which relies on collection size estimation; the comparison, using the standard R-value metric, produced encouraging results. The method was also used to analyse the content of the most popular search engines in use today, including Google and Yahoo, as well as several specialist search engines such as PubMed and that of the U.S. Department of Agriculture. In conclusion, this research shows that the ontology-based method obviates the need for collection size estimation.
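To make the subject-based collection representation concrete, the following is a minimal, hypothetical Python sketch of the idea the abstract describes: classification rules map document terms to subjects, each engine's sampled documents are summarised as a subject distribution, and engines are ranked for a query by subject overlap rather than by estimated collection size. The rules, engine names, and documents are invented placeholders, not the trained ontology or data from this thesis.

```python
from collections import Counter

# Hypothetical classification rules (term -> subject), standing in for
# the rules mined from the trained ontology described in the abstract.
RULES = {
    "gene": "biology", "protein": "biology",
    "tariff": "economics", "inflation": "economics",
    "crop": "agriculture", "soil": "agriculture",
}

def subject_profile(documents):
    """Represent a collection as a distribution over subjects
    rather than as a bag of terms."""
    counts = Counter()
    for doc in documents:
        for term in doc.lower().split():
            if term in RULES:
                counts[RULES[term]] += 1
    total = sum(counts.values()) or 1
    return {s: c / total for s, c in counts.items()}

def rank_collections(query, profiles):
    """Score each collection by the weight it gives to the query's
    subjects; no collection-size estimate is needed."""
    query_subjects = {RULES[t] for t in query.lower().split() if t in RULES}
    scores = {name: sum(p.get(s, 0.0) for s in query_subjects)
              for name, p in profiles.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy "sampled" documents from two hypothetical search engines.
profiles = {
    "engine_a": subject_profile(["gene protein study", "protein folding"]),
    "engine_b": subject_profile(["crop soil report", "soil tariff data"]),
}
print(rank_collections("soil crop yield", profiles))
```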
Abstract:
PURPOSE: To explore the effects of glaucoma and aging on low-spatial-frequency contrast sensitivity, using tests designed to assess performance of either the magnocellular (M) or parvocellular (P) visual pathways. METHODS: Contrast sensitivity was measured for spatial frequencies of 0.25 to 2 cyc/deg using a published steady- and pulsed-pedestal approach. Sixteen patients with glaucoma and 16 approximately age-matched control subjects participated. Patients with glaucoma were tested foveally and at two midperipheral locations: (1) an area of early visual field loss, and (2) an area of normal visual field. Control subjects were assessed at matched locations. An additional group of 12 younger control subjects (aged 20-35 years) was also tested. RESULTS: Older control subjects demonstrated reduced sensitivity relative to the younger group for both the steady-pedestal (presumed M) and pulsed-pedestal (presumed P) conditions. Sensitivity was reduced foveally and in the midperiphery across the spatial frequency range. In the area of early visual field loss, the glaucoma group demonstrated a further sensitivity reduction relative to older control subjects across the spatial frequency range for both the steady- and pulsed-pedestal tasks. Sensitivity was also reduced in the midperipheral location of "normal" visual field for the pulsed condition. CONCLUSIONS: Normal aging results in a reduction of contrast sensitivity in the low-spatial-frequency-sensitive components of both the M and P pathways. Glaucoma results in a further reduction of sensitivity that is not selective for M or P function. The low-spatial-frequency-sensitive channels of both pathways, which are presumably mediated by cells with larger receptive fields, are approximately equally impaired in early glaucoma.
Abstract:
This paper presents an investigation into the properties of a new narrative technique for career assessment and counselling, My Career Chapter: A Dialogical Autobiography. The technique is used to facilitate clients’ construction of a meaningful career-related autobiography. Previous research indicates the usefulness of My Career Chapter for adult clients and its alignment with recommendations for the development and application of qualitative assessment and counselling techniques. This study commences research into the technique’s applicability to adolescents. A focus group comprising guidance counselling professionals, whose work primarily pertained to the needs of adolescents, found that there is potential to develop a version of My Career Chapter suitable for adolescents.
Abstract:
This paper summarizes the papers presented in the thematic stream Models for the Analysis of Individual and Group Needs at the 2007 IAEVG-SVP-NCDA Symposium: Vocational Psychology and Career Guidance Practice: An International Partnership. The predominant theme that emerged from the papers was that theory and practice need to be positioned within their contexts. For this paper, context has been formulated as a dimension ranging from the individual’s experience of himself or herself in conversations, including interpersonal transactions and body culture, through to the broader levels of education, work, nation, and economy.
Abstract:
This thesis focuses on the volatile and hygroscopic properties of mixed aerosol species, in particular the influence that organic species of varying solubility have upon seed aerosols. Aerosol studies were conducted at the Paul Scherrer Institut Laboratory for Atmospheric Chemistry (PSI-LAC, Villigen, Switzerland) and at the Queensland University of Technology International Laboratory for Air Quality and Health (QUT-ILAQH, Brisbane, Australia). The primary measurement tool employed in this program was the Volatilisation and Hygroscopicity Tandem Differential Mobility Analyser (VHTDMA - Johnson et al. 2004). This system was initially developed at QUT within the ILAQH and was completely re-developed as part of this project (see Section 1.4 for a description of this process). The new VHTDMA was deployed to the PSI-LAC, where an analysis of the volatile and hygroscopic properties of ammonium sulphate seeds coated with organic species formed from the photo-oxidation of α-pinene was conducted. This investigation was driven by a desire to understand the influence of atmospherically prevalent organics upon water uptake by material with cloud-forming capabilities. Of particular note from this campaign were the observed influences of partially soluble organic coatings upon inorganic ammonium sulphate seeds above and below their deliquescence relative humidity (DRH). Above the DRH of the seed, increasing the volume fraction of the organic component was shown to reduce the water uptake of the mixed particle. Below the DRH, the organic was shown to activate the water uptake of the seed. This was the first time this effect had been observed for α-pinene-derived SOA. In contrast with the simulated aerosols generated at the PSI-LAC, a case study of the volatile and hygroscopic properties of diesel emissions was undertaken. During this stage of the project, ternary nucleation was shown, for the first time, to be one of the processes involved in the formation of diesel particulate matter. Furthermore, these particles were shown to be coated with a volatile hydrophobic material which prevented the water uptake of the highly hygroscopic material beneath. This result was also a first and indicated that previous studies into the hygroscopicity of diesel emissions had erroneously reported the particles to be hydrophobic. Both of these results contradict the previously upheld Zdanovskii-Stokes-Robinson (ZSR) additive rule for water uptake by mixed species. This is an important contribution, as it adds to the weight of evidence limiting the validity of this rule.
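For reference, the ZSR additive rule that these results challenge predicts the hygroscopic growth factor (GF) of a mixed particle as the dry-volume-fraction-weighted combination of the pure-component growth factors. Below is a minimal Python sketch of that rule; the component volume fractions and growth factors are illustrative values, not measurements from this thesis.

```python
def zsr_growth_factor(components):
    """ZSR additive rule: GF_mix = (sum_i eps_i * GF_i**3) ** (1/3),
    where eps_i is the dry volume fraction of component i and GF_i is
    its pure-component growth factor at the same relative humidity."""
    gf_cubed = sum(eps * gf**3 for eps, gf in components)
    return gf_cubed ** (1.0 / 3.0)

# Illustrative values only: an ammonium sulphate seed (GF ~ 1.5 at 90% RH)
# with an organic coating assumed to have GF ~ 1.1.
mixture = [(0.7, 1.5), (0.3, 1.1)]  # (volume fraction, growth factor)
print(f"ZSR-predicted mixed GF: {zsr_growth_factor(mixture):.3f}")
```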
Abstract:
The rising problems associated with construction, such as decreasing quality and productivity, labour shortages, occupational safety, and inferior working conditions, have opened the possibility of more revolutionary solutions within the industry. One prospective option is the implementation of innovative technologies such as automation and robotics, which have the potential to improve the industry in terms of productivity, safety and quality. The construction work site could, theoretically, be contained in a safer environment, with more efficient execution of the work, greater consistency of the outcome and a higher level of control over the production process. By identifying the barriers to construction automation and robotics implementation, and investigating ways in which to overcome them, contributions could be made towards better understanding and facilitating, where relevant, greater use of these technologies in the construction industry so as to promote its efficiency. This research aims to ascertain and explain the barriers to construction automation and robotics implementation by exploring and establishing the relationship between characteristics of the construction industry and attributes of existing construction automation and robotics technologies, on the one hand, and the level of usage and implementation, on the other, in three selected countries: Japan, Australia and Malaysia. These three countries were chosen because their construction industry characteristics provide contrast in terms of culture, gross domestic product, technology application, organisational structure and labour policies. This research uses a mixed-method approach to gathering both quantitative and qualitative data, employing a questionnaire survey and an interview schedule with a wide-ranging sample, from management through to on-site users, working in companies from small (less than AUD 0.2 million) to large (more than AUD 500 million) and involved in a broad range of business types and construction sectors. Detailed quantitative (statistical) and qualitative (content) data analysis is performed to provide a set of descriptions, relationships, and differences. The statistical tests selected include cross-tabulations and bivariate and multivariate analysis for investigating possible relationships between variables, and Kruskal-Wallis and Mann-Whitney U tests on independent samples for hypothesis testing and for generalising from the research sample to the construction industry population. Findings and conclusions arising from the research, including the ranking schemes produced for four key areas (construction attributes affecting level of usage; barrier variables; differing levels of usage between countries; and future trends), establish a number of potential areas that could affect the level of implementation both globally and for individual countries.
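The non-parametric tests named above are straightforward to reproduce. The following is a minimal Python sketch of a Kruskal-Wallis test across the three countries followed by a pairwise Mann-Whitney U test, using SciPy; the usage ratings are fabricated placeholders, not the survey data from this research.

```python
from scipy import stats

# Hypothetical ordinal usage ratings (e.g. a 1-5 scale) per country;
# values are placeholders, not this study's questionnaire responses.
japan     = [5, 4, 5, 4, 3, 5, 4]
australia = [3, 2, 3, 4, 2, 3, 3]
malaysia  = [2, 1, 2, 3, 2, 1, 2]

# Kruskal-Wallis: do usage levels differ across the three countries?
h, p = stats.kruskal(japan, australia, malaysia)
print(f"Kruskal-Wallis H={h:.2f}, p={p:.4f}")

# Mann-Whitney U: pairwise follow-up between two countries.
u, p2 = stats.mannwhitneyu(japan, malaysia, alternative="two-sided")
print(f"Mann-Whitney U={u:.1f}, p={p2:.4f}")
```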
Abstract:
Principal topic: Effectuation theory suggests that entrepreneurs develop their new ventures in an iterative way, selecting possibilities through flexibility and interaction with the market, focussing on affordable loss rather than maximal return on the capital invested, and developing pre-commitments and alliances with stakeholders (Sarasvathy, 2001, 2008; Sarasvathy et al., 2005, 2006). In contrast, causation may be described as a rationalistic reasoning method for creating a company: after a comprehensive market analysis to discover opportunities, the entrepreneur selects the alternative with the highest expected return and implements it through the use of a business plan. However, little is known about the consequences of following either of these two processes. One aspect that remains unclear is the relationship between newness and effectuation. On one hand, it can be argued that a means-centred process that is interactive (through pre-commitments and alliances with stakeholders from the early phases of venture creation) and open-minded (through the flexibility to exploit contingencies) should encourage and facilitate the development of innovative solutions. On the other hand, having a close relationship with their “future first customers” and focussing too much on the resources and knowledge already within the firm may be constraints that are not conducive to innovation, or at least not to radical innovation. While it has been suggested that an effectuation strategy is more likely to be used by innovative entrepreneurs, this hypothesis has not yet been demonstrated (Sarasvathy, 2001). Method: In our attempt to capture newness in its different aspects, we considered the following four domains in which newness may occur: new product/service; new methods for promotion and sales; new production methods/sourcing; and market creation. We identified how effectuation may be differently associated with these four domains of newness. To test our four sets of hypotheses, a dataset of 1329 firms (702 nascent and 627 young firms) randomly selected in Australia was examined using ANOVA with the Tukey HSD test. Results and Implications: Results indicate the existence of a curvilinear relationship between effectuation and newness, in which low and high levels of newness are associated with low levels of effectuation, while medium levels of newness are associated with high levels of effectuation. Implications for academia, practitioners and policy makers are also discussed.
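The ANOVA-plus-Tukey-HSD comparison reported here follows a standard pattern. Below is a minimal Python sketch using SciPy and statsmodels; the effectuation scores are fabricated to mimic the inverted-U (curvilinear) pattern described, and are not the study's data.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
# Placeholder effectuation scores for firms grouped by newness level;
# the medium group is given a higher mean to mimic the curvilinear result.
low    = rng.normal(2.0, 0.5, 30)
medium = rng.normal(3.5, 0.5, 30)
high   = rng.normal(2.1, 0.5, 30)

# One-way ANOVA: do mean effectuation scores differ across groups?
f, p = stats.f_oneway(low, medium, high)
print(f"One-way ANOVA: F={f:.2f}, p={p:.4f}")

# Tukey HSD: which pairs of newness levels differ?
scores = np.concatenate([low, medium, high])
groups = ["low"] * 30 + ["medium"] * 30 + ["high"] * 30
print(pairwise_tukeyhsd(scores, groups))
```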
Abstract:
Selecting an appropriate business process modelling technique is an important task within the methodological challenges of a business process management project. While a plethora of techniques has been developed over recent decades, there is an obvious shortage of well-accepted reference frameworks that can be used to evaluate and compare the capabilities of the different techniques. Academic progress has been made at least in the area of representational analyses, which use an ontology as a benchmark for such evaluations. This paper reflects on comprehensive experience with the application of a model based on the Bunge ontology in this context. A brief overview of the underlying research model characterizes the different steps in such a research project. A comparative summary of previous representational analyses of process modelling techniques over time gives insights into the relative maturity of selected process modelling techniques. Based on these experiences, suggestions are made as to where ontology-based representational analyses could be further developed and what limitations are inherent to such analyses.
Abstract:
The availability of innumerable intelligent building (IB) products, and the current dearth of inclusive building component selection methods, suggest that decision makers may be confronted with the quandary of forming a particular combination of components to suit the needs of a specific IB project. Despite this problem, few empirical studies have so far been undertaken to analyse the selection of IB systems and to identify key selection criteria for major IB systems. This study is designed to fill these research gaps. Two surveys, a general survey and an analytic hierarchy process (AHP) survey, are proposed to achieve these objectives. The general survey aims to collect general views from IB experts and practitioners in order to identify the perceived critical selection criteria, while the AHP survey prioritizes the perceived criteria and assigns them importance weightings. Results generally suggest that each IB system is determined by a disparate set of selection criteria with different weightings. ‘Work efficiency’ is perceived to be the most important core selection criterion for the various IB systems, while ‘user comfort’, ‘safety’ and ‘cost effectiveness’ are also considered significant. Two sub-criteria, ‘reliability’ and ‘operating and maintenance costs’, are regarded as prime factors to be considered in selecting IB systems. The current study contributes to the industry and to IB research in at least two respects. First, it widens the understanding of the selection criteria for IB systems, as well as their degree of importance. Second, it adopts a multi-criteria AHP approach, a new method for analysing and selecting building systems in IBs. Further research would investigate the inter-relationships amongst the selection criteria.
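The AHP weighting step used in the second survey follows a standard recipe: pairwise comparison judgments on Saaty's 1-9 scale are collected in a reciprocal matrix, priority weights are taken from the normalised principal eigenvector, and a consistency ratio checks the coherence of the judgments. Below is a minimal Python sketch; the three criteria and the judgment values are illustrative, not the survey's results.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
# criteria named in the abstract: work efficiency, user comfort, safety.
A = np.array([
    [1.0, 3.0, 2.0],
    [1/3, 1.0, 1/2],
    [1/2, 2.0, 1.0],
])

# Priority weights: normalised principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))
```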
Abstract:
More than a century ago in their definitive work “The Right to Privacy” Samuel D. Warren and Louis D. Brandeis highlighted the challenges posed to individual privacy by advancing technology. Today’s workplace is characterised by its reliance on computer technology, particularly the use of email and the Internet to perform critical business functions. Increasingly these and other workplace activities are the focus of monitoring by employers. There is little formal regulation of electronic monitoring in Australian or United States workplaces. Without reasonable limits or controls, this has the potential to adversely affect employees’ privacy rights. Australia has a history of legislating to protect privacy rights, whereas the United States has relied on a combination of constitutional guarantees, federal and state statutes, and the common law. This thesis examines a number of existing and proposed statutory and other workplace privacy laws in Australia and the United States. The analysis demonstrates that existing measures fail to adequately regulate monitoring or provide employees with suitable remedies where unjustifiable intrusions occur. The thesis ultimately supports the view that enacting uniform legislation at the national level provides a more effective and comprehensive solution for both employers and employees. Chapter One provides a general introduction and briefly discusses issues relevant to electronic monitoring in the workplace. Chapter Two contains an overview of privacy law as it relates to electronic monitoring in Australian and United States workplaces. In Chapter Three there is an examination of the complaint process and remedies available to a hypothetical employee (Mary) who is concerned about protecting her privacy rights at work. Chapter Four provides an analysis of the major themes emerging from the research, and also discusses the draft national uniform legislation. Chapter Five details the proposed legislation in the form of the Workplace Surveillance and Monitoring Act, and Chapter Six contains the conclusion.
Abstract:
Experience plays an important role in building management. “How often will this asset need repair?” or “How much time is this repair going to take?” are the types of questions that project and facility managers face daily in planning activities. Failure or success in developing good schedules, budgets and other project management tasks depends on the project manager's ability to obtain reliable information with which to answer these types of questions. Young practitioners tend to rely on information based on regional averages and provided by publishing companies, in contrast to experienced project managers, who tend to rely heavily on personal experience. Another aspect of building management is that many practitioners are seeking to improve available scheduling algorithms, estimating spreadsheets and other project management tools. Such “micro-scale” levels of research are important in providing the tools required for the project manager's tasks. However, even with such tools, low-quality input information will produce inaccurate schedules and budgets as output. Thus, it is also important to take a broader approach to research at a more “macro-scale”. Recent trends show that the Architecture, Engineering and Construction (AEC) industry is experiencing explosive growth in its capability to generate and collect data. A great deal of valuable knowledge can be obtained from the appropriate use of this data, and the need has therefore arisen to analyse this increasing amount of available data. Data mining can be applied as a powerful tool to extract relevant and useful information from this sea of data. Knowledge Discovery in Databases (KDD) and Data Mining (DM) are tools that allow the identification of valid, useful, and previously unknown patterns, so that large amounts of project data may be analysed. These technologies combine techniques from machine learning, artificial intelligence, pattern recognition, statistics, databases, and visualization to automatically extract concepts, interrelationships, and patterns of interest from large databases. The project involves the development of a prototype tool to support facility managers, building owners and designers. This final report presents the AIMM™ prototype system and documents which data mining techniques can be applied and how, the results of their application, and the benefits gained from the system. The AIMM™ system is capable of searching for useful patterns of knowledge and correlations within existing building maintenance data to support decision making about future maintenance operations. The application of the AIMM™ prototype system to building models and their maintenance data (supplied by industry partners) utilises various data mining algorithms, and the maintenance data is analysed using interactive visual tools. Applying the AIMM™ prototype system to improve maintenance management and the building life cycle involves: (i) data preparation and cleaning; (ii) integrating meaningful domain attributes; (iii) performing extensive data mining experiments using visual analysis (stacked histograms), classification and clustering techniques, and association rule mining algorithms such as “Apriori”; and (iv) filtering and refining the data mining results, including the potential implications of these results for improving maintenance management.
Maintenance data for a variety of asset types were selected for demonstration, with the aim of discovering meaningful patterns to assist facility managers in strategic planning and to provide a knowledge base to help shape future requirements and design briefing. Utilising the prototype system developed here, positive and interesting results regarding patterns and structures in the data have been obtained.
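To illustrate the kind of association rule mining named in step (iii), here is a minimal Python sketch using the mlxtend implementation of Apriori on fabricated maintenance records; the asset types, failure modes, and thresholds are invented placeholders, not the AIMM™ system or its industry data.

```python
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

# Fabricated maintenance records: each row lists attributes of one job.
records = [
    ["pump", "bearing_failure", "summer"],
    ["pump", "bearing_failure", "winter"],
    ["hvac", "filter_blocked", "summer"],
    ["pump", "seal_leak", "summer"],
    ["hvac", "filter_blocked", "summer"],
]

# One-hot encode the transactions for the Apriori algorithm.
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(records).transform(records), columns=te.columns_)

# Mine frequent itemsets, then derive association rules from them.
frequent = apriori(onehot, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.6)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```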
Abstract:
Although the benefits of service-orientation are widely asserted in the literature, the review, analysis, and evaluation of 30 existing service analysis approaches presented in this paper show that a comprehensive approach to the identification and analysis of both business and supporting software services is missing. Based on this evaluation of existing approaches and on additional sources, we close this gap by proposing an integrated, consolidated approach to business and software service analysis that combines and extends the strengths of the examined methodologies.
Abstract:
The service-orientation paradigm has not only become prevalent in the software systems domain in recent years, but is also increasingly applied at the business level to restructure organisational capabilities. In this paper, we present the results of an extensive literature review of 30 approaches related to service identification and analysis in both domains. Based on the consolidation of a superset of comparison criteria for service-oriented methodologies found in the related literature, we compare and evaluate the different characteristics of service engineering methods, with a focus on service analysis. Although close business-IT alignment is regarded as one of the core beneficial promises of service-orientation, our analysis suggests that there is a lack of a unified, comprehensive methodology for service identification and analysis that integrates and addresses both domains. We therefore discuss how our results can inform directions for future research in this area.