Abstract:
Associations between single nucleotide polymorphisms (SNPs) at 5p15 and multiple cancer types have been reported. We have previously shown evidence for a strong association between prostate cancer (PrCa) risk and rs2242652 at 5p15, intronic in the TERT gene, which encodes telomerase reverse transcriptase. To comprehensively evaluate the association between genetic variation across this region and PrCa, we performed a fine-mapping analysis by genotyping 134 SNPs using a custom Illumina iSelect array or Sequenom MassArray iPlex, followed by imputation of 1094 SNPs in 22 301 PrCa cases and 22 320 controls in the PRACTICAL consortium. Multiple stepwise logistic regression analysis identified four signals in the promoter or intronic regions of TERT that were independently associated with PrCa risk. Gene expression analysis of normal prostate tissue showed evidence that SNPs within one of these regions were also associated with TERT expression, providing a potential mechanism for predisposition to disease.
Abstract:
This paper argues that, despite its strengths, the UK Department for Culture, Media and Sport (DCMS) classification of the creative industries contains inconsistencies which need to be addressed to make it fully fit for purpose. It presents an improved methodology which retains the strengths of the DCMS's approach while addressing its deficiencies. We focus on creative intensity: the proportion of total employment within an industry that is engaged in creative occupations.
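The creative-intensity measure can be sketched numerically. All employment figures and the classification threshold below are invented for illustration; they are not DCMS data.

```python
# Illustrative sketch of the creative-intensity calculation: the share of an
# industry's total employment that is in creative occupations. All numbers
# and the 30% threshold are hypothetical, chosen only to show the mechanics.

def creative_intensity(creative_jobs: int, total_jobs: int) -> float:
    """Proportion of an industry's employment in creative occupations."""
    return creative_jobs / total_jobs

industries = {
    # industry: (employment in creative occupations, total employment)
    "Advertising": (60_000, 100_000),
    "Software publishing": (45_000, 150_000),
    "Food retail": (1_000, 500_000),
}

threshold = 0.30  # hypothetical cut-off for classifying an industry as creative
for name, (creative, total) in industries.items():
    intensity = creative_intensity(creative, total)
    label = "creative" if intensity >= threshold else "not creative"
    print(f"{name}: {intensity:.1%} -> {label}")
```

An industry-level threshold of this kind is what lets an occupation-based measure drive an industry classification.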
Abstract:
Sfinks is a shift-register-based stream cipher designed for hardware implementation and submitted to the eSTREAM project. In this paper, we analyse the initialisation process of Sfinks. We demonstrate a slid property of the loaded state of the Sfinks cipher, whereby multiple key-IV pairs may produce phase-shifted keystream sequences. The state update functions of both the initialisation process and keystream generation, together with the padding pattern, affect the generation of slid pairs.
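A toy linear feedback shift register (not Sfinks itself) can illustrate what a slid pair of loaded states means: if one loaded state equals another state stepped forward k times, the two keystreams coincide up to a phase shift of k positions.

```python
# Toy illustration of phase-shifted keystreams from slid loaded states.
# The 4-bit LFSR and its taps are invented for demonstration only.

def lfsr_step(state):
    """One step of a toy 4-bit LFSR with taps at positions 0 and 3."""
    out = state[0]
    feedback = state[0] ^ state[3]
    return state[1:] + [feedback], out

def keystream(state, n):
    bits = []
    for _ in range(n):
        state, out = lfsr_step(state)
        bits.append(out)
    return bits

s0 = [1, 0, 0, 1]
# Slide s0 forward by k = 3 steps to obtain a second "loaded state".
s1 = s0
for _ in range(3):
    s1, _ = lfsr_step(s1)

ks0 = keystream(s0, 12)
ks1 = keystream(s1, 9)
assert ks0[3:] == ks1  # keystreams agree up to a shift of k = 3
```

In a real cipher the interesting question, as the abstract notes, is which key-IV pairs the loading and padding rules allow to land on such slid states.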
Abstract:
Value Management (VM) is a proven methodology that provides a structured framework, using supporting tools and techniques, to facilitate effective decision-making in many types of projects, thus achieving 'best value' for clients. It offers an exceptionally robust approach to exploring the need for and function of projects so that they align with clients' objectives. The functional analysis and creativity phases of VM are crucial, as they focus on utilising innovative thinking to understand the objectives of clients' projects and provide value-adding solutions at the early discovery stages of projects. There is, however, a perception of VM as just another cost-cutting tool, which has overshadowed the fundamental benefits of the method and limited both its influence and wider use in the construction industry. This paper describes findings from a series of case studies conducted at the project and corporate levels of current publicly funded infrastructure projects in Malaysia. The study aims to investigate the VM processes practised by the project client organisation and to evaluate the effects of project team involvement in VM workshops during the design stage of these projects. The focus of the study is on how issues related to 'upstream' infrastructure design, aimed at improving 'downstream' construction processes on-site, are resolved through multi-disciplinary team consideration and decision-making. Findings from the case studies indicate that the mix of disciplines of project team members at the design stage of a VM workshop has minimal influence on improving construction processes. However, the degree of interaction, institutionalised thinking, cultural dimensions and visualisation aids adopted have a significant impact on maximising creativity amongst project team members during VM workshops.
The case studies conducted for this research focused on infrastructure projects that utilise the traditional VM workshop as the client's chosen VM methodology to review and develop designs. Document reviews and semi-structured interviews with project teams were used as data collection techniques for the case studies. The outcomes of this research are expected to offer alternative perspectives for construction professionals and clients to minimise constraints and strengthen strategies for implementing VM on future projects.
Abstract:
The challenge of persistent appearance-based navigation and mapping is to develop an autonomous robotic vision system that can simultaneously localize, map and navigate over the lifetime of the robot. However, the computation time and memory requirements of current appearance-based methods typically scale not only with the size of the environment but also with the operation time of the platform; also, repeated revisits to locations will develop multiple competing representations which reduce recall performance. In this paper we present a solution to the persistent localization, mapping and global path planning problem in the context of a delivery robot in an office environment over a one-week period. Using a graphical appearance-based SLAM algorithm, CAT-Graph, we demonstrate constant time and memory loop closure detection with minimal degradation during repeated revisits to locations, along with topological path planning that improves over time without using a global metric representation. We compare the localization performance of CAT-Graph to openFABMAP, an appearance-only SLAM algorithm, and the path planning performance to occupancy-grid based metric SLAM. We discuss the limitations of the algorithm with regard to environment change over time and illustrate how the topological graph representation can be coupled with local movement behaviors for persistent autonomous robot navigation.
Abstract:
Well-designed initialisation and keystream generation processes for stream ciphers should ensure that each key-IV pair generates a distinct keystream. In this paper, we analyse some ciphers where this does not happen, due to state convergence occurring during initialisation, during keystream generation, or both. We show how state convergence occurs in each case and identify two mechanisms which can cause state convergence.
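The underlying mechanism can be illustrated with a hypothetical non-injective state-update function: once two distinct internal states map to the same successor, the distinct key-IV loadings that produced them can no longer yield distinct keystreams.

```python
# Toy illustration of state convergence. The 3-bit state and its update
# function are invented; the point is only that a non-injective update
# merges distinct states, so distinct key-IV loadings can collide.

def update(state):
    """Hypothetical state update; the OR of the two leading bits makes
    the map non-injective (information about a and b individually is lost)."""
    a, b, c = state
    return (b, c, a | b)

s_a = (0, 1, 0)
s_b = (1, 1, 0)
assert s_a != s_b
assert update(s_a) == update(s_b)  # two distinct states converge
```

A bijective update rules this out entirely, which is why invertibility of the state transition is a standard design requirement.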
Abstract:
A5/1 is a shift register based stream cipher which uses a majority clocking rule to update its registers. It is designed to provide privacy for the GSM system. In this paper, we analyse the initialisation process of A5/1. We demonstrate a sliding property of the A5/1 cipher, where every valid internal state is also a legitimate loaded state and multiple key-IV pairs produce phase shifted keystream sequences. We describe a possible ciphertext only attack based on this property.
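The majority clocking rule can be sketched as follows. The register lengths (19, 22, 23) and clocking-bit positions (8, 10, 10) follow the published description of A5/1; the feedback taps and bit ordering used here are illustrative conventions, not a drop-in GSM implementation.

```python
# Sketch of A5/1's majority clocking: only registers whose clocking bit
# agrees with the majority of the three clocking bits are stepped.

def majority(a, b, c):
    return (a & b) | (a & c) | (b & c)

# (clocking-bit index, feedback-tap indices) for registers R1, R2, R3
SPEC = [(8, (13, 16, 17, 18)), (10, (20, 21)), (10, (7, 20, 21, 22))]

def clock(states):
    """One clock of the three registers under the majority rule."""
    m = majority(*(s[c] for s, (c, _) in zip(states, SPEC)))
    out = []
    for s, (c, taps) in zip(states, SPEC):
        if s[c] == m:
            fb = 0
            for t in taps:
                fb ^= s[t]
            s = [fb] + s[:-1]  # shift in the feedback bit
        out.append(s)
    return out

# The majority rule guarantees that two or three registers move every clock.
r1 = [1, 0] * 9 + [1]        # 19-bit register
r2 = [0, 1] * 11             # 22-bit register
r3 = [1, 1, 0] * 7 + [1, 0]  # 23-bit register
m = majority(r1[8], r2[10], r3[10])
clocked = [r1[8] == m, r2[10] == m, r3[10] == m]
assert sum(clocked) >= 2
stepped = clock([r1, r2, r3])
```

Because loading and clocking act on the same register structure, a clocked state is again a well-formed state, which is the structural fact the sliding property exploits.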
Abstract:
Australian child protection systems have been subject to sustained and significant criticism for many decades. As a central part of that system Children’s Courts have been implicated: three recent inquiries into the child protection system in Victoria all criticised the Family Division of the Children’s Court.1 In the resulting debate two diametrically opposed points of view surfaced about the Children’s Court and the role that legal procedures and professionals should play in child protection matters. On one side bodies like the Children’s Court of Victoria, Victoria Legal Aid (‘VLA’), the Law Institute of Victoria (‘LIV’), and the Federation of Community Legal Centres (‘FCLC’) argued that the Children’s Court plays a vital role in child protection and should continue to play that role.2 On the other side a coalition of human service and child protection agencies called for major change including the removal of the Children’s Court from the child protection system. Victoria’s Department of Human Services (‘DHS’) has been critical of the Court3 as have community sector organisations like Anglicare, Berry Street, MacKillop Family Services and the Salvation Army — all agencies the DHS funds to deliver child protection services.4 Victoria’s Child Safety Commissioner has also called for major reform, publicly labelling the Court a ‘lawyers’ playground’ and recommending abolishing the Court’s involvement in child protection completely.
Abstract:
As the financial planning industry undergoes a series of reforms aimed at increased professionalism and improved quality of advice, financial planner training in Australia and elsewhere has begun to acknowledge the importance of interdisciplinary knowledge bases in informing both curriculum design and professional practice (e.g. FPA 2009). This paper underscores the importance of the process of financial planning by providing a conceptual analysis of the six-step financial planning process using key mechanisms derived from theory and research in cognate disciplines such as psychology and well-being. The paper identifies how these mechanisms may operate to impact client well-being in the financial planning context. The conceptual mapping of the mechanisms to process elements of financial planning is a unique contribution to the financial planning literature and offers a further framework in the armamentarium of researchers interested in pursuing questions around the value of financial planning. The conceptual framework derived from the analysis also adds to the growing body of literature aimed at developing an integrated model of financial planning.
Abstract:
Chemical treatments of kaolins to produce nanocrystalline or "X-ray amorphous", stable aluminosilicates with variable - but reproducible - types of micro- and meso-porosity have been developed. These materials show cation exchange capacities and surface area values significantly higher (ranging from 10x to 100x) than kaolin and show good acid resistance to pH~3.0. The combination of these properties offers strong potential for many new applications of kaolin-derived materials in large worldwide markets such as environmental remediation and catalysis. Kaolin amorphous derivative (KAD) is well-suited to removal of many toxic metals down to the ppb range from acid mine drainage. Engineering development trials of the KAD manufacturing process and the utilisation of KAD in polluted waters such as acid mine drainage indicate that scale-up from bench-scale is not a barrier to market entry.
Abstract:
Organizations from every industry sector seek to enhance their business performance and competitiveness through the deployment of contemporary information systems (IS), such as Enterprise Systems (ERP). Investments in ERP are complex and costly, attracting scrutiny and pressure to justify their cost. Thus, IS researchers highlight the need for systematic evaluation of information system success, or impact, which has resulted in the introduction of varied models for evaluating information systems. One of these systematic measurement approaches is the IS-Impact Model introduced by a team of researchers at Queensland University of Technology (QUT) (Gable, Sedera, & Chan, 2008). The IS-Impact Model is conceptualized as a formative, multidimensional index that consists of four dimensions. Gable et al. (2008) define IS-Impact as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (p.381). The IT Evaluation Research Program (ITE-Program) at QUT has grown the IS-Impact Research Track with the central goal of conducting further studies to enhance and extend the IS-Impact Model. The overall goal of the IS-Impact research track at QUT is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable, 2009). In order to achieve that, the IS-Impact research track advocates programmatic research having the principles of tenacity, holism, and generalizability through extension research strategies. This study was conducted within the IS-Impact Research Track, to further generalize the IS-Impact Model by extending it to the Saudi Arabian context. According to Hofstede (2012), the national culture of Saudi Arabia is significantly different from the Australian national culture, making the Saudi Arabian culture an interesting context for testing the external validity of the IS-Impact Model.
The study re-visits the IS-Impact Model from the ground up. Rather than assume the existing instrument is valid in the new context, or simply assess its validity through quantitative data collection, the study takes a qualitative, inductive approach to re-assessing the necessity and completeness of existing dimensions and measures. This is done in two phases: an Exploratory Phase and a Confirmatory Phase. The Exploratory Phase addresses the first research question of the study: "Is the IS-Impact Model complete and able to capture the impact of information systems in Saudi Arabian organizations?". The content analysis, used to analyze the Identification Survey data, indicated that 2 of the 37 measures of the IS-Impact Model are not applicable for the Saudi Arabian context. Moreover, no new measures or dimensions were identified, evidencing the completeness and content validity of the IS-Impact Model. In addition, the Identification Survey data suggested several concepts related to IS-Impact, the most prominent of which was "Computer Network Quality" (CNQ). The literature supported the existence of a theoretical link between IS-Impact and CNQ (CNQ is viewed as an antecedent of IS-Impact). With the primary goal of validating the IS-Impact Model within its extended nomological network, CNQ was introduced to the research model. The Confirmatory Phase addresses the second research question of the study: "Is the Extended IS-Impact Model valid as a hierarchical multidimensional formative measurement model?". The objective of the Confirmatory Phase was to test the validity of the IS-Impact Model and the CNQ Model. To achieve that, IS-Impact, CNQ, and IS-Satisfaction were operationalized in a survey instrument, and then the research model was assessed by employing the Partial Least Squares (PLS) approach. The CNQ model was validated as a formative model. Similarly, the IS-Impact Model was validated as a hierarchical multidimensional formative construct.
However, the analysis indicated that one of the IS-Impact Model indicators was insignificant and could be removed from the model. Thus, the resulting Extended IS-Impact Model consists of 4 dimensions and 34 measures. Finally, the structural model was also assessed on two aspects: explanatory and predictive power. The analysis revealed that the path coefficient between CNQ and IS-Impact is significant (t-value = 4.826) and relatively strong (β = 0.426), with CNQ explaining 18% of the variance in IS-Impact. These results supported the hypothesis that CNQ is an antecedent of IS-Impact. The study demonstrates that the quality of the computer network affects the quality of the Enterprise System (ERP) and consequently the impacts of the system. Therefore, practitioners should pay attention to computer network quality. Similarly, the path coefficient between IS-Impact and IS-Satisfaction was significant (t-value = 17.79) and strong (β = 0.744), with IS-Impact alone explaining 55% of the variance in Satisfaction, consistent with the results of the original IS-Impact study (Gable et al., 2008). The research contributions include: (a) supporting the completeness and validity of the IS-Impact Model as a hierarchical multidimensional formative measurement model in the Saudi Arabian context, (b) operationalizing Computer Network Quality as conceptualized in ITU-T Recommendation E.800 (ITU-T, 1993), (c) validating CNQ as a formative measurement model and as an antecedent of IS-Impact, and (d) conceptualizing and validating IS-Satisfaction as a reflective measurement model and as an immediate consequence of IS-Impact. The CNQ model provides a framework for perceptually measuring Computer Network Quality from multiple perspectives. The CNQ model features an easy-to-understand, easy-to-use, and economical survey instrument.
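As a quick arithmetic check of the figures reported above: in a structural model with a single standardized predictor, the variance explained equals the squared path coefficient, and both reported R² values are consistent with their path coefficients.

```python
# Consistency check: for a single standardized predictor, R^2 = beta^2.
# The coefficients are those reported in the study above.

beta_cnq = 0.426     # path: CNQ -> IS-Impact
beta_impact = 0.744  # path: IS-Impact -> IS-Satisfaction

assert abs(beta_cnq ** 2 - 0.18) < 0.005    # ~18% of variance in IS-Impact
assert abs(beta_impact ** 2 - 0.55) < 0.005  # ~55% of variance in Satisfaction
```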
Abstract:
Introduction: Undergraduate students studying the Bachelor of Radiation Therapy at Queensland University of Technology (QUT) attend clinical placements in a number of department sites across Queensland. To ensure that the curriculum prepares students for the most common treatments and current techniques in use in these departments, a curriculum matching exercise was performed. Methods: A cross-sectional census was performed on a pre-determined “Snapshot” date in 2012. This was undertaken by the clinical education staff in each department, who used a standardized proforma to count the number of patients as well as prescription, equipment, and technique data for a list of tumour site categories. This information was combined into aggregate anonymized data. Results: All 12 Queensland radiation therapy clinical sites participated in the Snapshot data collection exercise to produce a comprehensive overview of clinical practice on the chosen day. A total of 59 different tumour sites were treated on the chosen day and, as expected, the most common treatment sites were prostate and breast, comprising 46% of patients treated. Data analysis also indicated that intensity-modulated radiotherapy (IMRT) use is relatively high, with 19.6% of patients receiving IMRT treatment on the chosen day. Both IMRT and image-guided radiotherapy (IGRT) indications matched recommendations from the evidence. Conclusion: The Snapshot method proved to be a feasible and efficient method of gathering useful data.
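The census aggregation can be sketched as a simple tally across departments. All department names and patient records below are invented for illustration; they are not the study's data.

```python
# Minimal sketch of the Snapshot-style aggregation: count patients per
# tumour site across departments and report the share of the most common
# sites. Every record here is hypothetical.

from collections import Counter

# (department, tumour site) records, one per patient treated on the census day
records = [
    ("Site A", "prostate"), ("Site A", "breast"), ("Site A", "lung"),
    ("Site B", "prostate"), ("Site B", "breast"), ("Site B", "prostate"),
    ("Site C", "breast"), ("Site C", "head and neck"),
]

by_tumour = Counter(site for _, site in records)
total = sum(by_tumour.values())
top_two = by_tumour.most_common(2)
share = sum(n for _, n in top_two) / total
print(f"Most common sites: {top_two}, {share:.0%} of patients")
```

The same tally, run per department with technique fields added, gives the prescription- and equipment-level breakdowns the proforma was designed to capture.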
Abstract:
Aims This research sought to determine optimal corn waste stream-based fermentation medium C and N sources and incubation time to maximize pigment production by an indigenous Indonesian Penicillium spp., as well as to assess pigment pH stability. Methods and Results A Penicillium spp. was isolated from Indonesian soil, identified as Penicillium resticulosum, and used to test the effects of carbon and nitrogen type and concentrations, medium pH, incubation period and furfural on biomass and pigment yield (PY) in a waste corncob hydrolysate basal medium. Maximum red PY (497.03 ± 55.13 mg l⁻¹) was obtained with a 21:1 C:N ratio, pH 5.5-6.0; yeast extract-, NH4NO3-, NaNO3-, MgSO4·7H2O-, xylose- or carboxymethylcellulose (CMC)-supplemented medium and 12 days (25°C, 60-70% relative humidity, dark) incubation. C source, C, N and furfural concentration, medium pH and incubation period all influenced biomass and PY. Pigment was stable over pH 2-9. Conclusions Penicillium resticulosum demonstrated pH-stable microbial pigment production potential using a xylose- or CMC- and N-source-supplemented waste stream cellulose culture medium. Significance and Impact of the Study Corn-derived waste stream cellulose can be used as a culture medium for fungal pigment production. Such application provides a process for agricultural waste stream resource reuse for production of compounds in increasing demand.
Abstract:
Historical information can be used, in addition to pedigree, traits and genotypes, to map quantitative trait loci (QTL) in general populations via maximum likelihood estimation of variance components. This analysis is known as linkage disequilibrium (LD) and linkage mapping, because it exploits both linkage in families and LD at the population level. The search for QTL in the wild population of Soay sheep on St. Kilda is a proof of principle. We analysed the data from a previous study and confirmed some of the QTLs reported. The most striking result was the confirmation of a QTL affecting birth weight that had been reported using association tests but not when using linkage-based analyses. Copyright © Cambridge University Press 2010.
Abstract:
A novel multiple regression method (RM) is developed to predict identity-by-descent probabilities at a locus L (IBD_L) among individuals without pedigree, given information on surrounding markers and population history. These IBD_L probabilities are a function of the increase in linkage disequilibrium (LD) generated by drift in a homogeneous population over generations. Three parameters are sufficient to describe population history: effective population size (Ne), number of generations since foundation (T), and marker allele frequencies among founders (p). IBD_L probabilities are used in a simulation study to map a quantitative trait locus (QTL) via variance component estimation. RM is compared to a coalescent method (CM) in terms of power and robustness of QTL detection. Differences between RM and CM are small but significant. For example, RM is more powerful than CM in dioecious populations, but not in monoecious populations. Moreover, RM is more robust than CM when marker phases are unknown, when there is complete LD among founders, or when Ne is wrong, and less robust when p is wrong. CM utilises all marker haplotype information, whereas RM utilises information contained in each individual marker and all possible marker pairs, but not in higher-order interactions. RM consists of a family of models encompassing four different population structures and two ways of using marker information, which contrasts with the single model that must cater for all possible evolutionary scenarios in CM.