Abstract:
The presence of insect pests in grain storages throughout the supply chain is a significant problem for farmers, grain handlers, and distributors worldwide. Insect monitoring and sampling programmes are used in the stored grains industry for the detection and estimation of pest populations. At the low pest densities dictated by economic and commercial requirements, the accuracy of both detection and abundance estimates can be influenced by variations in the spatial structure of pest populations over short distances. Geostatistical analysis of Rhyzopertha dominica populations in two and three dimensions showed that insect numbers were positively correlated over short (0.5 cm) distances, and negatively correlated over longer (>10 cm) distances. At 35 °C, insects were located significantly further from the grain surface than at 25 and 30 °C. Dispersion metrics showed statistically significant aggregation in all cases. The observed heterogeneous spatial distribution of R. dominica may also be influenced by factors such as the site of initial infestation and disturbance during handling. To account for these additional factors, I significantly extended a simulation model that incorporates both pest growth and movement through a typical stored-grain supply chain. By incorporating the effects of abundance, initial infestation site, grain handling, and treatment on pest spatial distribution, I developed a supply chain model that includes estimates of pest spatial distribution. This was used to examine several scenarios representative of grain movement through a supply chain, and to determine the influence of infestation location and grain disturbance on the sampling intensity required to detect pest infestations at various infestation rates. This study has investigated the effects of temperature, infestation point, and grain handling on the spatial distribution and detection of R. dominica. The proportion of grain infested was found to be dependent upon abundance, initial pest location, and grain handling. Simulation modelling indicated that accounting for these factors when developing sampling strategies for stored grain has the potential to significantly reduce sampling costs while simultaneously improving detection rates, resulting in reduced storage and pest management costs while improving grain quality.
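As an illustration of the kind of spatial correlation analysis described above, the sketch below computes a simple correlogram of counts against separation distance. The 3-D sample coordinates, Poisson counts, and lag bins are purely hypothetical, and the study's actual geostatistical method (e.g. variogram fitting) may differ.

```python
import numpy as np

def correlogram(coords, counts, bin_edges):
    """Mean pairwise correlation of counts, binned by separation distance."""
    dev = counts - counts.mean()
    var = dev.var()
    # pairwise separation distances and products of count deviations
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    prod = np.outer(dev, dev)
    iu = np.triu_indices(len(counts), k=1)        # unique pairs only
    d, prod = d[iu], prod[iu]
    rho = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (d >= lo) & (d < hi)
        rho.append(prod[in_bin].mean() / var if in_bin.any() else np.nan)
    return np.array(rho)

# hypothetical: counts at 300 random sample points in a grain bulk (cm)
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(300, 3))
counts = rng.poisson(2, size=300).astype(float)
print(correlogram(coords, counts, bin_edges=[0, 0.5, 2, 10, 20]))
```

Positive correlation in the shortest lag bin and negative correlation beyond 10 cm would correspond to the pattern reported for R. dominica.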
Abstract:
This thesis highlights the limitations of existing car following models in emulating driver behaviour for safety study purposes. It also compares the capabilities of the mainstream car following models in reproducing precise driver behaviour parameters such as headways and times to collision. The comparison evaluates the robustness of each car following model for reproducing safety metrics. A new car following model, based on the personal space concept and the fish school model, is proposed to simulate traffic metrics more precisely. This new model is capable of reflecting changes in the headway distribution after imposing the speed limit from VSL systems. This research facilitates assessing Intelligent Transportation Systems on motorways using microscopic simulation.
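Since the abstract describes the model only at a high level, the sketch below is a minimal, hypothetical "personal space" car-following rule loosely in the spirit of force-based (fish school) models, together with the time-to-collision safety metric; the gains and desired gap are illustrative assumptions, not the thesis's model.

```python
def follower_accel(gap, speed, leader_speed,
                   desired_gap=20.0, k_gap=0.1, k_speed=0.5):
    """Push toward the desired personal space and match the leader's speed."""
    return k_gap * (gap - desired_gap) + k_speed * (leader_speed - speed)

def time_to_collision(gap, speed, leader_speed):
    """Safety metric: seconds until contact at the current closing speed."""
    closing = speed - leader_speed
    return gap / closing if closing > 0 else float("inf")

# hypothetical: follower at 30 m/s, 15 m behind a leader doing 25 m/s
print(follower_accel(15.0, 30.0, 25.0))      # -3.0 m/s^2 (braking)
print(time_to_collision(15.0, 30.0, 25.0))   # 3.0 s
```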
Abstract:
Cross-Lingual Link Discovery (CLLD) is a new problem in Information Retrieval. The aim is to automatically identify meaningful and relevant hypertext links between documents in different languages. This is particularly helpful in knowledge discovery if a multi-lingual knowledge base is sparse in one language or another, or the topical coverage in each language is different; such is the case with Wikipedia. Techniques for identifying new and topically relevant cross-lingual links are a current topic of interest at NTCIR, where the CrossLink task has been running since NTCIR-9 in 2011. This paper presents the evaluation framework for benchmarking cross-lingual link discovery algorithms in the context of NTCIR-9. This framework includes topics, document collections, assessments, metrics, and a toolkit for pooling, assessment, and evaluation. The assessments are further divided into two separate sets: manual assessments performed by human assessors, and automatic assessments based on links extracted from Wikipedia itself. Using this framework we show that manual assessment is more robust than automatic assessment in the context of cross-lingual link discovery.
Abstract:
Purpose: The challenges of providing housing that sustains its inhabitants socially, economically and environmentally, and is inherently sustainable for the planet as a whole, require a holistic systems approach that considers the product, the supply chain and the market, as well as the inter-dependencies within and between each of these process points. The purpose of the research is to identify factors that impact the sustainability performance outcomes of residential dwellings and the diffusion of sustainable housing into the mainstream housing market. Design/methodology/approach: This research represents a snapshot in time: a recording of the experiences of seven Australian families who are “early adopters” of leading edge sustainable homes within a specific sustainable urban development in subtropical Queensland. The research adopts a qualitative approach to compare the goals and expectations of these families with the actual sustainability aspects incorporated into their homes and lifestyles. Findings: The results show that the “product” – a sustainable house – is difficult to define; that sustainability outcomes were strongly influenced by individual concerns and the contextual urban environment; and that economic comparisons with “standard” housing are challenging. Research limitations/implications: This qualitative study is based on seven families (13 individuals) in an Ecovillage in southeast Queensland. Although the findings make a significant contribution to knowledge, they may not be generalisable to the wider population. Originality/value: The experiences of these early adopter families suggest that the housing market and regulators play critical roles, through actions and language, in limiting or enhancing the diffusion of sustainable housing into the market.
Abstract:
This paper presents an overview of the NTCIR-10 Cross-lingual Link Discovery (CrossLink-2) task. For the task, we continued using the evaluation framework developed for the NTCIR-9 CrossLink-1 task. Overall, recommended links were evaluated at two levels (file-to-file and anchor-to-file), and system performance was evaluated with three metrics: LMAP, R-Prec and P@N.
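For readers unfamiliar with these metrics, the sketch below shows their standard forms computed over a single topic's ranked recommendations. LMAP is a link-oriented variant of mean average precision used in CrossLink, so plain average precision is shown here as an illustrative stand-in; the example data are hypothetical.

```python
def precision_at_n(ranked, relevant, n):
    """P@N: fraction of the top N recommended links that are relevant."""
    return sum(1 for doc in ranked[:n] if doc in relevant) / n

def r_precision(ranked, relevant):
    """R-Prec: precision at N = number of relevant links for the topic."""
    return precision_at_n(ranked, relevant, len(relevant)) if relevant else 0.0

def average_precision(ranked, relevant):
    """AP: mean of the precision values at each rank where a hit occurs."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank
    return total / len(relevant) if relevant else 0.0

ranked, relevant = ["a", "b", "c", "d"], {"a", "c"}
print(precision_at_n(ranked, relevant, 2))   # 0.5
print(r_precision(ranked, relevant))         # 0.5
print(average_precision(ranked, relevant))   # 0.833...
```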
Abstract:
Queensland University of Technology (QUT) Library offers a range of resources and services to researchers as part of its research support portfolio. This poster will present key features of two of the data management services offered by research support staff at QUT Library. The first service is QUT Research Data Finder (RDF), a product of the Australian National Data Service (ANDS) funded Metadata Stores project. RDF is a data registry (metadata repository) that aims to publicise datasets that are research outputs arising from completed QUT research projects. The second is a software and code registry, which is currently under development with the sole purpose of improving discovery of source code and software as QUT research outputs.

RESEARCH DATA FINDER
As an integrated metadata repository, Research Data Finder aligns with institutional sources of truth, such as QUT’s research administration system, ResearchMaster, as well as QUT’s Academic Profiles system, to provide high quality data descriptions that increase awareness of, and access to, shareable research data. The repository and its workflows are designed to foster better data management practices, enhance opportunities for collaboration and research, promote cross-disciplinary research and maximise the impact of existing research data sets.

SOFTWARE AND CODE REGISTRY
The QUT Library software and code registry project stems from concerns amongst researchers with regards to development activities, storage, accessibility, discoverability and impact, sharing, copyright and IP ownership of software and code. As a result, the Library is developing a registry for code and software research outputs, which will use the existing Research Data Finder architecture. The underpinning software for both registries is VIVO, open source software developed by Cornell University. The registry will use the Research Data Finder service instance of VIVO and will include a searchable interface, links to code/software locations and metadata feeds to Research Data Australia. Key benefits of the project include: improving the discoverability and reuse of QUT researchers’ code and software amongst QUT and the QUT research community; increasing the profile of QUT research outputs on a national level by providing a metadata feed to Research Data Australia; and improving the metrics for access and reuse of code and software in the repository.
Abstract:
Background: Illumina's Infinium SNP BeadChips are extensively used in both small and large-scale genetic studies. A fundamental step in any analysis is the processing of raw allele A and allele B intensities from each SNP into genotype calls (AA, AB, BB). Various algorithms which make use of different statistical models are available for this task. We compare four methods (GenCall, Illuminus, GenoSNP and CRLMM) on data where the true genotypes are known in advance and on data from a recently published genome-wide association study. Results: In general, differences in accuracy are relatively small between the methods evaluated, although CRLMM and GenoSNP were found to consistently outperform GenCall. The performance of Illuminus is heavily dependent on sample size, with lower no-call rates and improved accuracy as the number of samples available increases. For X chromosome SNPs, methods with sex-dependent models (Illuminus, CRLMM) perform better than methods which ignore gender information (GenCall, GenoSNP). We observe that CRLMM and GenoSNP are more accurate at calling SNPs with low minor allele frequency than GenCall or Illuminus. The sample quality metrics from each of the four methods were found to have a high level of agreement at flagging samples with unusual signal characteristics. Conclusions: CRLMM, GenoSNP and GenCall can be applied with confidence in studies of any size, as their performance was shown to be invariant to the number of samples available. Illuminus, on the other hand, requires a larger number of samples to achieve comparable levels of accuracy, and its use in smaller studies (50 or fewer individuals) is not recommended.
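The comparison rests on two simple quantities per method: the no-call rate and the accuracy of the calls that are made. A minimal sketch of those two metrics is shown below; the genotype coding (0=AA, 1=AB, 2=BB, -1=no call) and the example arrays are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def call_metrics(calls, truth, no_call=-1):
    """Return (call rate, accuracy among called SNPs) for one sample."""
    called = calls != no_call
    call_rate = called.mean()
    accuracy = (calls[called] == truth[called]).mean() if called.any() else np.nan
    return call_rate, accuracy

calls = np.array([0, 1, -1, 2, 1])   # one sample's calls across 5 SNPs
truth = np.array([0, 1, 1, 2, 2])    # known true genotypes
print(call_metrics(calls, truth))    # (0.8, 0.75)
```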
Abstract:
Generally, the magnitude of pollutant emissions from diesel engines running on biodiesel fuel is closely coupled to the structure of the molecules that constitute the fuel. Previous studies demonstrated the relationship between the organic fraction of PM and its oxidative potential. Herein, emissions from a diesel engine running on different biofuels were analysed in more detail to explore the role different organic fractions play in the measured oxidative potential. In this work, a more detailed chemical analysis of biofuel PM was undertaken using a compact Time of Flight Aerosol Mass Spectrometer (c-ToF AMS). This enabled a better identification of the different organic fractions that contribute to the overall measured oxidative potentials. The concentration of reactive oxygen species (ROS) was measured using a profluorescent nitroxide molecular probe, 9-(1,1,3,3-tetramethylisoindolin-2-yloxyl-5-ethynyl)-10-(phenylethynyl)anthracene (BPEAnit). The oxidative potential of the PM, measured through the ROS content, although proportional to the total organic content in certain cases, showed a much higher correlation with the oxygenated organic fraction as measured by the c-ToF AMS. This highlights the importance of knowing the surface chemistry of particles for assessing their health impacts. It also sheds light on new aspects of particulate emissions that should be taken into account when establishing relevant metrics for assessing the health implications of replacing diesel with alternative fuels.
Abstract:
Lean strategies have been developed to eliminate or reduce manufacturing waste and thus improve operational efficiency in manufacturing processes. However, implementing lean strategies requires a large amount of resources and, in practice, manufacturers encounter difficulties in selecting appropriate lean strategies within their resource constraints. There is currently no systematic methodology available for selecting appropriate lean strategies within a manufacturer's resource constraints. In the lean transformation process, it is also critical to measure the current and desired leanness levels in order to clearly evaluate lean implementation efforts. Despite the fact that many lean strategies are utilized to reduce or eliminate manufacturing waste, little effort has been directed towards properly assessing the leanness of manufacturing organizations. In practice, a single or specific group of metrics (either qualitative or quantitative) will only partially measure the overall leanness. Existing leanness assessment methodologies do not offer a comprehensive evaluation method integrating both quantitative and qualitative lean measures into a single quantitative value for measuring the overall leanness of an organization. This research aims to develop mathematical models and a systematic methodology for selecting appropriate lean strategies and evaluating the leanness levels in manufacturing organizations. Mathematical models were formulated and a methodology was developed for selecting appropriate lean strategies, within manufacturers' limited amount of available resources, to reduce their identified wastes. A leanness assessment model was developed using the fuzzy concept to assess the leanness level and to recommend an optimum leanness value for a manufacturing organization. In the proposed leanness assessment model, both quantitative and qualitative input factors have been taken into account. Based on programs developed in MATLAB and C#, a decision support tool (DST) was developed for decision makers to select lean strategies and evaluate the leanness value based on the proposed models and methodology, and hence sustain lean implementation efforts. A case study was conducted to demonstrate the effectiveness of the proposed models and methodology. Case study results suggested that, out of 10 wastes identified, the case organization (ABC Limited) is able to improve a maximum of six wastes at the selected workstation within its resource limitations. The selected wastes are: unnecessary motion, setup time, unnecessary transportation, inappropriate processing, work-in-process inventory, and raw material inventory. The suggested lean strategies are: 5S, Just-In-Time, the Kanban System, the Visual Management System (VMS), Cellular Manufacturing, Standard Work Process using method-time measurement (MTM), and Single Minute Exchange of Die (SMED). From the suggested lean strategies, the impact of 5S was demonstrated by measuring the leanness level in two different situations at ABC. After that, MTM was suggested as a standard work process for further improvement of the current leanness value. The initial status of the organization showed a leanness value of 0.12. By applying 5S, the leanness level improved significantly to reach 0.19, and the simulation of MTM as a standard work method showed the leanness value could be improved to 0.31. The optimum leanness value of ABC was calculated to be 0.64. These leanness values provided a quantitative indication of the impacts of improvement initiatives on the overall leanness level of the case organization. Sensitivity analysis and a t-test were also performed to validate the proposed models. This research advances the current knowledge base by developing mathematical models and methodologies to overcome lean strategy selection and leanness assessment problems. By selecting appropriate lean strategies, a manufacturer can better prioritize implementation efforts and resources to maximize the benefits of implementing lean strategies in their organization. The leanness index is used to evaluate an organization's current (before lean implementation) leanness state against the state after lean implementation, and to establish a benchmark (the optimum leanness state). Hence, this research provides a continuous improvement tool for a lean manufacturing organization.
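To make the fuzzy leanness idea concrete, the sketch below aggregates triangular fuzzy metric scores into a single crisp leanness index. The weights, scores, and centroid defuzzifier are illustrative assumptions, not the thesis's exact model.

```python
def weighted_tfn(scores, weights):
    """Weighted average of triangular fuzzy numbers (a, b, c)."""
    total = sum(weights)
    return tuple(sum(w * s[i] for s, w in zip(scores, weights)) / total
                 for i in range(3))

def centroid(tfn):
    """Defuzzify a triangular fuzzy number to a crisp value."""
    a, b, c = tfn
    return (a + b + c) / 3

# hypothetical fuzzy metric scores on a [0, 1] leanness scale
scores = [(0.1, 0.2, 0.3), (0.0, 0.1, 0.2), (0.2, 0.3, 0.5)]
weights = [0.5, 0.3, 0.2]
print(centroid(weighted_tfn(scores, weights)))   # crisp leanness index
```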
Abstract:
To date, available literature mainly discusses Twitter activity patterns in the context of individual case studies, while comparative research on a large number of communicative events, their dynamics and patterns is missing. By conducting a comparative study of more than forty different cases (covering topics such as elections, natural disasters, corporate crises, and televised events) we identify a number of distinct types of discussion which can be observed on Twitter. Drawing on a range of communicative metrics, we show that thematic and contextual factors influence the usage of different communicative tools available to Twitter users, such as original tweets, @replies, retweets, and URLs. Based on this first analysis of the overall metrics of Twitter discussions, we also demonstrate stable patterns in the use of Twitter in the context of major topics and events.
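The communicative metrics mentioned above reduce, at their simplest, to the share of each tweet type within a case. The sketch below uses naive text heuristics to classify tweets and report those shares; the heuristics and example tweets are illustrative assumptions, not the study's coding scheme.

```python
def tweet_metrics(tweets):
    """Share of original tweets, retweets, @replies, and URL-bearing tweets."""
    n = len(tweets)
    retweets = sum(1 for t in tweets if t.startswith("RT @"))
    replies = sum(1 for t in tweets if t.startswith("@"))
    with_url = sum(1 for t in tweets if "http" in t)
    original = n - retweets - replies
    return {"original": original / n, "retweets": retweets / n,
            "@replies": replies / n, "URLs": with_url / n}

# hypothetical tweets from one communicative event
tweets = ["Big news about #qldfloods http://example.com",
          "RT @abcnews: flood warning issued",
          "@abcnews thanks for the update"]
print(tweet_metrics(tweets))
```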
Abstract:
A graph theoretic approach is developed for accurately computing haulage costs in earthwork projects. This is vital as haulage is a predominant factor in the real cost of earthworks. A variety of metrics can be used in our approach, but a fuel consumption proxy is recommended. This approach is novel as it considers the constantly changing terrain that results from cutting and filling activities, and replaces the inaccurate “static” calculations that have been used previously. The approach is also capable of efficiently correcting violations of the top-down cutting and bottom-up filling conditions that can be found in existing earthwork assignments and sequences. This approach assumes that the project site is partitioned into uniform blocks. A directed graph is then utilised to describe the terrain surface. This digraph is altered after each cut and fill in order to reflect the true state of the terrain. A shortest path algorithm is successively applied to calculate the cost of each haul, and these costs are summed to provide a total cost of haulage.
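A minimal sketch of the core computation, assuming blocks as graph nodes and edge weights as a fuel-consumption proxy: Dijkstra's algorithm finds the cheapest haul between a cut block and a fill block. The grid, the weights, and the node names are illustrative; in the approach described, the digraph's edge weights would be updated after each cut and fill before the next shortest-path query, and the per-haul costs summed.

```python
import heapq

def dijkstra(adj, src, dst):
    """adj: {node: [(neighbour, cost), ...]}; returns cheapest haul cost."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, ()):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# hypothetical 4-block terrain; weights are a fuel-burn proxy
adj = {"cut": [("a", 2.0), ("b", 5.0)],
       "a":   [("fill", 2.5)],
       "b":   [("fill", 1.0)]}
print(dijkstra(adj, "cut", "fill"))   # 4.5, routed via block "a"
```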
Abstract:
Increasing global competition, rapid technological changes, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring performance improvements in dynamic lean supply chain situations. Therefore, appropriate measurement of lean supply chain performance has become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involved in high volume products but may not be effective for low volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of supply chains using both quantitative and qualitative metrics, and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics, and the effects of various lean tools on the performance metrics mentioned in the SCOR framework, have been investigated. A lean supply chain model based on the SCOR metric framework is then developed, in which non-lean and lean, as well as quantitative and qualitative, measures are incorporated into appropriate metrics. The values of the metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data have been collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy based method is then applied to measure the performance improvements in supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative to maximise similarity with positive ideal solutions and to minimise similarity with negative ideal solutions, the performances of lean and non-lean supply chain situations for three different apparel products have been evaluated. To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that the implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics. It can therefore effectively measure the improvements for a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
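The TOPSIS step described above ranks alternatives by their closeness to a positive ideal and distance from a negative ideal. The sketch below shows the standard crisp form of TOPSIS (the thesis uses triangular fuzzy numbers); the score matrix, weights, and the benefit-criteria assumption are illustrative.

```python
import numpy as np

def topsis(matrix, weights):
    """Closeness coefficients: rows = alternatives, columns = metrics."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)          # vector-normalise columns
    v = norm * np.asarray(weights)                 # weighted normalised scores
    ideal, anti = v.max(axis=0), v.min(axis=0)     # positive/negative ideals
    d_pos = np.linalg.norm(v - ideal, axis=1)      # distance to positive ideal
    d_neg = np.linalg.norm(v - anti, axis=1)       # distance to negative ideal
    return d_neg / (d_pos + d_neg)                 # higher = better

# hypothetical: three supply chain situations scored on time, quality, flexibility
print(topsis([[0.7, 0.8, 0.5],
              [0.9, 0.6, 0.7],
              [0.6, 0.9, 0.8]],
             weights=[0.4, 0.4, 0.2]))
```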
Abstract:
Daylight devices are important components of any climate responsive façade system. However, the evolution of parametric CAD systems and digital fabrication has had an impact on architectural form, so that regular forms are shifting to complex geometries. Architectural and engineering integration of daylight devices in envelopes with complex geometries is a challenge in terms of design and performance evaluation. The purpose of this paper is to assess the daylight performance of a building with a climate responsive envelope with complex geometry that integrates shading devices in the façade. The case study is based on the Esplanade buildings in Singapore. Climate-based daylight metrics such as Daylight Availability and Useful Daylight Illuminance are used. DIVA (daylight simulation) and Grasshopper (parametric analysis) plug-ins for Rhinoceros have been employed to examine the range of performance possibilities. Parameters such as dimension, inclination of the device, projected shadows and shape have been changed in order to maximize Daylight Availability and Useful Daylight Illuminance while minimizing glare probability. While orientation did not have a great impact on the results, the aperture of the shading devices did, showing that shading devices with a projection of 1.75 m to 2.00 m performed best, achieving target lighting levels without issues of glare.
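As a point of reference for the Useful Daylight Illuminance metric mentioned above, the sketch below computes the fraction of hours in which a sensor's illuminance falls within a "useful" band. The common 100-2000 lux convention and the hourly readings are illustrative assumptions, not values from the paper.

```python
def udi(hourly_lux, lower=100.0, upper=2000.0):
    """Useful Daylight Illuminance: share of hours within the useful band."""
    within = sum(1 for lux in hourly_lux if lower <= lux <= upper)
    return within / len(hourly_lux)

print(udi([50, 150, 800, 2500, 1200]))   # 3 of 5 hours useful -> 0.6
```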
Abstract:
We present a mini-review of the development and contemporary applications of diffusion-sensitive nuclear magnetic resonance (NMR) techniques in biomedical sciences. Molecular diffusion is a fundamental physical phenomenon present in all biological systems. Due to the connection between experimentally measured diffusion metrics and the microscopic environment sensed by the diffusing molecules, diffusion measurements can be used for characterisation of molecular size, molecular binding and association, and the morphology of biological tissues. The emergence of magnetic resonance was instrumental to the development of biomedical applications of diffusion. We discuss the fundamental physical principles of diffusion NMR spectroscopy and diffusion MR imaging. The emphasis is placed on conceptual understanding, historical evolution and practical applications rather than complex technical details. Mathematical description of diffusion is presented to the extent that it is required for the basic understanding of the concepts. We present a wide range of spectroscopic and imaging applications of diffusion magnetic resonance, including colloidal drug delivery vehicles; protein association; characterisation of cell morphology; neural fibre tractography; cardiac imaging; and the imaging of load-bearing connective tissues. This paper is intended as an accessible introduction into the exciting and growing field of diffusion magnetic resonance.
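For context on the "connection between experimentally measured diffusion metrics and the microscopic environment", the standard pulsed-gradient spin-echo attenuation (the Stejskal-Tanner relation) is shown below; this is a textbook result included for illustration, not an equation quoted from the paper.

```latex
S = S_0 \, e^{-bD},
\qquad
b = \gamma^2 g^2 \delta^2 \left( \Delta - \frac{\delta}{3} \right)
```

Here S and S_0 are the signals with and without diffusion weighting, γ is the gyromagnetic ratio, g and δ are the gradient amplitude and duration, Δ is the gradient separation, and D is the diffusion coefficient being measured.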
Abstract:
IEEE 802.11p is the new standard for Inter-Vehicular Communications (IVC) using the 5.9 GHz frequency band, as part of the DSRC framework; it will enable applications based on Cooperative Systems. Simulation is widely used to estimate or verify the potential benefits of such cooperative applications, notably in terms of safety for drivers. We have developed a performance model for 802.11p that can be used by simulations of cooperative applications (e.g. collision avoidance) without requiring intricate models of the whole IVC stack. Instead, it provides a straightforward yet realistic model of IVC performance. Our model uses data from extensive field trials to infer the correlation between speed, distance and performance metrics such as maximum range, latency and frame loss. We then improve this model to limit the number of profiles that have to be generated when there are more than a few emitter-receiver pairs in a given location. Our model generates realistic performance for rural or suburban environments among small groups of IVC-equipped vehicles and road side units.
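A minimal sketch of the profile-based idea: a simulation looks up frame loss by interpolating a (distance, loss) profile derived from field trials, instead of simulating the radio stack. The profile values below are purely illustrative, and a full model would also index profiles by relative speed.

```python
import bisect

# hypothetical (distance in m, frame loss probability) profile from trials
PROFILE = [(0, 0.01), (100, 0.05), (200, 0.15), (300, 0.40), (400, 0.80)]

def frame_loss(distance_m):
    """Linearly interpolate frame loss from the measured profile."""
    xs = [d for d, _ in PROFILE]
    i = bisect.bisect_right(xs, distance_m)
    if i == 0:
        return PROFILE[0][1]
    if i == len(PROFILE):
        return PROFILE[-1][1]
    (x0, y0), (x1, y1) = PROFILE[i - 1], PROFILE[i]
    return y0 + (y1 - y0) * (distance_m - x0) / (x1 - x0)

print(frame_loss(250))   # interpolated loss probability at 250 m: 0.275
```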