Abstract:
Understanding the mechanism by which an unfolded polypeptide chain folds to its unique, functional structure is a primary unsolved problem in biochemistry. Fundamental advances towards understanding how proteins fold have come from kinetic studies, which allow the dissection of the folding pathway of a protein into individual steps defined by partially structured folding intermediates. Improvements in both the structural and temporal resolution of the physical methods used to monitor the folding process, as well as the development of new methodologies, are now making it possible to obtain detailed structural information on protein folding pathways. The protein engineering methodology has been particularly useful in characterizing the structures of folding intermediates as well as the transition state of folding. Several characteristics of protein folding pathways have begun to emerge as general features of the folding of many different proteins. Progress in our understanding of how structure develops during folding is reviewed here.
Abstract:
Statistical reports of SMEs' Internet usage from various countries indicate steady growth. However, deeper investigation of SMEs' e-commerce adoption and usage reveals that a number of SMEs fail to realize the full potential of e-commerce. Factors such as the lack of Information Systems and Information Technology tools and models for SMEs, and the lack of technical expertise and specialized knowledge within and outside the SME, have the greatest effect. This study addresses these two factors in two steps: first, it introduces a conceptual tool for intuitive interaction; second, it explains the implementation process of the conceptual tool with the help of a case study. The subject chosen for the case study is a real estate SME from India. The design and development process of the SME's website was captured over the four-month duration of the study. Results indicated specific benefits for web designers and SME business owners, and showed that the conceptual tool is easy to use without the need for technical expertise or specialized knowledge.
Abstract:
Background: Tuberculosis still remains one of the largest killer infectious diseases, warranting the identification of newer targets and drugs. Identification and validation of appropriate targets for designing drugs are critical steps in drug discovery, and are at present major bottlenecks. A majority of drugs in current clinical use for many diseases were designed without knowledge of their targets, perhaps because standard methodologies to identify such targets in a high-throughput fashion do not really exist. With the different kinds of 'omics' data now available, computational approaches can be a powerful means of obtaining shortlists of possible targets for further experimental validation. Results: We report a comprehensive in silico target identification pipeline, targetTB, for Mycobacterium tuberculosis. The pipeline incorporates a network analysis of the protein-protein interactome, a flux balance analysis of the reactome, experimentally derived phenotype essentiality data, sequence analyses and a structural assessment of targetability, using novel algorithms recently developed by us. Using flux balance analysis and network analysis, proteins critical for the survival of M. tuberculosis are first identified, followed by comparative genomics with the host, and finally a novel structural analysis of the binding sites to assess the feasibility of each protein as a target. Further analyses include correlation with expression data and non-similarity to gut flora proteins as well as 'anti-targets' in the host, leading to the identification of 451 high-confidence targets. Through phylogenetic profiling against 228 pathogen genomes, the shortlisted targets have been further explored to identify broad-spectrum antibiotic targets, while also identifying those specific to tuberculosis. Targets that address mycobacterial persistence and drug resistance mechanisms are also analysed.
Conclusion: The pipeline developed provides a rational schema for drug target identification that is likely to have a high rate of success, which is expected to save enormous amounts of money, resources and time in the drug discovery process. A thorough comparison with previously suggested targets in the literature demonstrates the usefulness of the integrated approach used in our study, highlighting in particular the importance of systems-level analyses. The method has the potential to be used as a general strategy for target identification and validation, and hence to significantly impact most drug discovery programmes.
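The successive-filter logic of such a shortlisting pipeline can be sketched with simple set operations. All protein names and filter memberships below are invented for illustration; the real targetTB pipeline applies these filters to genome-scale interactome, reactome and structural data.

```python
# Toy sketch of a target-shortlisting pipeline in the spirit of the one
# described above. Names and set memberships are illustrative only.

essential = {"rpoB", "inhA", "gyrA", "embB", "ddlA"}   # flux balance / network analysis
human_homologs = {"gyrA"}                              # comparative genomics vs the host
gut_flora_like = {"ddlA"}                              # similarity to gut flora proteins
targetable = {"rpoB", "inhA", "embB", "katG"}          # structural binding-site assessment

# Successive filters: essential, no host homolog, dissimilar to gut
# flora, and structurally targetable.
shortlist = (essential - human_homologs - gut_flora_like) & targetable

print(sorted(shortlist))  # → ['embB', 'inhA', 'rpoB']
```

Each filter only ever removes candidates, so the order of the set differences does not affect the final shortlist.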
Abstract:
The purpose of this article is to show the applicability and benefits of design-of-experiments techniques as an optimization tool for discrete simulation models. Simulated systems are computational representations of real-life systems; their state evolves continually as discrete events occur over time. In this study, a production system designed around the JIT (Just in Time) business philosophy is used; JIT seeks to achieve excellence in organizations through waste reduction in all operational aspects. The most typical tool of JIT systems is KANBAN production control, which seeks to synchronize demand with the flow of materials, minimize work in process, and define production metrics. Using experimental design techniques for stochastic optimization, the impact of the operational factors on the efficiency of the KANBAN/CONWIP simulation model is analyzed. The results show the effectiveness of integrating experimental design techniques and discrete simulation models in the calculation of the operational parameters. Furthermore, the reliability of the methodologies found was improved with a new statistical consideration.
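The shape of such an experiment can be sketched as a full factorial design with replicates over a stand-in for the simulation. The `simulate` function below is a toy stochastic response, not the authors' KANBAN/CONWIP model, and the factor names and levels are invented:

```python
import itertools
import random

def simulate(n_kanbans, container_size, rng):
    """Toy response: throughput rises with the WIP cap but saturates,
    with Gaussian noise standing in for discrete-event variability."""
    base = min(n_kanbans * container_size, 40)
    return base + rng.gauss(0, 1)

rng = random.Random(42)
design = list(itertools.product([2, 4, 8], [5, 10]))  # 3 x 2 factor levels
replicates = 5

# Mean throughput at each design point, averaged over replicates.
results = {}
for levels in design:
    runs = [simulate(*levels, rng) for _ in range(replicates)]
    results[levels] = sum(runs) / replicates

best = max(results, key=results.get)
print(best, round(results[best], 2))
```

In a real study the per-point replicates would feed an ANOVA to separate factor effects from simulation noise before selecting operating parameters.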
Abstract:
This paper presents a Hamiltonian model of marine vehicle dynamics in six degrees of freedom in both body-fixed and inertial momentum coordinates. The model in body-fixed coordinates exhibits a particular structure of the mass matrix that allows the adaptation and application of interconnection and damping assignment passivity-based control design methodologies developed for robust stabilisation of mechanical systems in terms of generalised coordinates. As an example of application, we follow this methodology to design a passivity-based tracking controller with integral action for fully actuated vehicles in six degrees of freedom. We also describe a momentum transformation that allows an alternative model representation resembling general port-Hamiltonian mechanical systems with a coordinate-dependent mass matrix. This can be seen as an enabling step towards adapting the theory of control of port-Hamiltonian systems, developed for robotic manipulators and multi-body mechanical systems, to marine craft dynamics.
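The port-Hamiltonian structure referred to at the end of the abstract can be sketched in its standard textbook form (this is the generic form, not the paper's exact marine-craft equations; $J=-J^{\top}$ is the interconnection matrix, $R=R^{\top}\ge 0$ the dissipation, $M(q)$ the coordinate-dependent mass matrix, and $u$, $y$ the port variables):

```latex
\dot{x} = \bigl(J(x) - R(x)\bigr)\frac{\partial H}{\partial x}(x) + g(x)\,u,
\qquad
y = g(x)^{\top}\frac{\partial H}{\partial x}(x),
\qquad
H(q,p) = \tfrac{1}{2}\,p^{\top} M^{-1}(q)\,p + V(q)
```

Passivity follows from $\dot{H} \le y^{\top}u$, which is the property that interconnection and damping assignment designs shape by assigning a desired $H$, $J$ and $R$ in closed loop.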
Abstract:
This paper presents three methodologies for determining optimum locations and magnitudes of reactive power compensation in power distribution systems. Methods I and II are suitable for complex distribution systems with a combination of radial and ring-main feeders and different voltage levels. Method III is suitable for low-tension, single-voltage-level radial feeders. Method I is based on an iterative scheme with successive power-flow analyses, with the optimization problem formulated and solved using linear programming. Methods II and III are essentially based on the steady-state performance of distribution systems. These methods are simple to implement and yield satisfactory results comparable with those of Method I. The proposed methods have been applied to a few distribution systems, and results obtained for two typical systems are presented for illustration.
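The core idea behind shunt compensation sizing can be illustrated with the standard feeder voltage-drop approximation ΔV ≈ X·Qc/V (in per unit), which gives the smallest capacitive injection Qc lifting a bus to its lower voltage limit. The bus data below are invented illustration values, not the paper's test systems, which use full power-flow analyses rather than this approximation:

```python
# Back-of-envelope capacitor sizing per bus using dV ≈ X * Qc / V (p.u.).
# Bus voltages, reactances and the limit are illustrative, assumed values.

buses = {              # bus: (voltage p.u., reactance to source p.u.)
    "A": (0.94, 0.08),
    "B": (0.92, 0.12),
    "C": (0.95, 0.05),
}
V_MIN = 0.95

def qc_needed(v, x, v_min=V_MIN):
    """Smallest Qc (p.u.) lifting the bus voltage to v_min; 0 if already ok."""
    return max(0.0, (v_min - v) * v / x)

sizing = {bus: round(qc_needed(v, x), 3) for bus, (v, x) in buses.items()}
print(sizing)
```

A linear-programming formulation, as in Method I, would instead minimise total compensation cost subject to such voltage constraints at all buses simultaneously.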
Abstract:
As the use of Twitter has become more commonplace throughout many nations, its role in political discussion has also increased. This has been evident in contexts ranging from general political discussion through local, state, and national elections (such as the 2010 Australian elections) to protests and other activist mobilisation (for example in the current uprisings in Tunisia, Egypt, and Yemen, as well as in the controversy around Wikileaks). Research into the use of Twitter in such political contexts has also developed rapidly, aided by substantial advances in quantitative and qualitative methodologies for capturing, processing, analysing, and visualising Twitter updates by large groups of users. Recent work has especially highlighted the role of the Twitter hashtag – a short keyword, prefixed with the hash symbol ‘#’ – as a means of coordinating a distributed discussion among larger or smaller groups of users, who do not need to be connected through existing ‘follower’ networks. Twitter hashtags – such as ‘#ausvotes’ for the 2010 Australian elections, ‘#londonriots’ for the coordination of information and political debates around the recent unrest in London, or ‘#wikileaks’ for the controversy around Wikileaks – thus aid the formation of ad hoc publics around specific themes and topics. They emerge from within the Twitter community – sometimes as a result of pre-planning or quickly reached consensus, sometimes through protracted debate about what the appropriate hashtag for an event or topic should be (which may also lead to the formation of competing publics using different hashtags). Drawing on innovative methodologies for the study of Twitter content, this paper examines the use of hashtags in political debate in the context of a number of major case studies.
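The first processing step in hashtag-based analyses of this kind can be sketched as extracting and counting hashtags from a batch of tweet texts. The sample tweets below are invented for illustration:

```python
import re
from collections import Counter

# Extract hashtags (case-folded) from tweet texts and tally them.
HASHTAG = re.compile(r"#\w+")

def count_hashtags(tweets):
    counts = Counter()
    for text in tweets:
        counts.update(tag.lower() for tag in HASHTAG.findall(text))
    return counts

tweets = [
    "Polls open across the country #ausvotes",
    "Live updates from Melbourne #ausvotes #auspol",
    "Cables released tonight #wikileaks",
]

print(count_hashtags(tweets).most_common(3))
```

From such counts, hashtag datasets are typically filtered to a target tag (e.g. ‘#ausvotes’) before the temporal and network analyses described above.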
Abstract:
The thesis focuses on the social interaction and behavior of the homeless living in Tokyo's Taito Ward. The study is based on the author's own ethnographic field research carried out in the autumn of 2003. The chosen methods were based on the methodology of participant observation, and they were applied depending on the context. The fieldwork was carried out from mid-August to the beginning of October 2003. The most important targets of the research were three separate, loosely knit groups in certain parts of Taito Ward: one based in proximity to the Ueno train station, one that gathered every morning around a homeless support organization called San'yûkai, and one based in Tamahime Park in the old San'ya area of Tokyo. The analysis is based on aspects of Takie Sugiyama Lebra's theory of "social relativism". Lebra's theory consists of the following, arguably universal, aspects: belongingness, empathy, dependence, place in society, and reciprocity. In addition, all interaction and behavior is tied to the context and the situation. According to Lebra, ritual and intimate situations produce similar action, which is socially relative; of these, the norms of ritual behavior are more regulated, while intimate behavior is more spontaneous. By contrast, an anomic situation produces anomic behavior, which is not socially relative. Lebra's theory is critically reviewed by the author, who has attempted to modify it to make it more adaptable to present-day society and to the analysis. Erving Goffman's views of social interaction and Anthony Giddens' theories of social structures have been used as a complementary theoretical basis.
The aim of the thesis is to clarify how and why the interaction and behavior of some homeless individuals follow the aspects of Lebra's "social relativism" in some situations, and why in other situations they do not. In the latter cases the answers can be sought in regional and individual differences, or in the inaptness of the theory for analyzing the situation at hand. Here, a significant factor is the major finding of the field study: the so-called "homeless etiquette", an abstract set of norms and values that influences the social interaction and behavior of the homeless, and with which many homeless individuals presented in the study complied. The fundamental goal of the thesis is to reach a profound understanding of the daily life of the homeless whose lives were studied. The author argues that this kind of profound understanding is necessary when looking for sustainable solutions in social and housing policy to improve the position of the homeless and the qualitative functioning of society.
Abstract:
The intent of this study was to design, document and implement a Quality Management System (QMS) in a laboratory that incorporated both research and development (R&D) and routine analytical activities. In addition, it was necessary for the QMS to be easily and efficiently maintained to: (a) provide documented evidence that would validate the system's compliance with a certifiable standard, (b) fit the purpose of the laboratory, (c) accommodate prevailing government policies and standards, and (d) promote positive outcomes for the laboratory through documentation and verification of the procedures and methodologies implemented. Initially, a matrix was developed that documented the standard's requirements and the steps necessary to meet them. The matrix provided a check mechanism on the progression of the system's development, and was later utilised in the Quality Manual as a reference tool for locating the full procedures documented elsewhere in the system. The documentation needed to build and monitor the system consisted of a series of manuals, along with forms that provided auditable evidence of the workings of the QMS. Quality Management (QM), in one form or another, has been in existence since the early 1900s. However, the question still remains: is it a good thing or just a bugbear? Many of the older-style systems failed because they were designed by non-users and were fiercely regulatory, restrictive and generally deemed an imposition. It is now considered important to foster a sense of ownership of the system among the people who use it. The system's design must be tailored to best fit the purpose of the facility's operations if maximum benefits to the organisation are to be gained.
Abstract:
Using DNA markers in plant breeding with marker-assisted selection (MAS) could greatly improve the precision and efficiency of selection, leading to the accelerated development of new crop varieties. The numerous examples of MAS in rice have prompted many breeding institutes to establish molecular breeding labs. The last decade has produced an enormous amount of genomics research in rice, including the identification of thousands of QTLs for agronomically important traits, the generation of large amounts of gene expression data, and cloning and characterization of new genes, including the detection of single nucleotide polymorphisms. The pinnacle of genomics research has been the completion and annotation of genome sequences for indica and japonica rice. This information – coupled with the development of new genotyping methodologies and platforms, and the development of bioinformatics databases and software tools – provides even more exciting opportunities for rice molecular breeding in the 21st century. However, the great challenge for molecular breeders is to apply genomics data in actual breeding programs. Here, we review the current status of MAS in rice, current genomics projects and promising new genotyping methodologies, and evaluate the probable impact of genomics research. We also identify critical research areas to "bridge the application gap" between QTL identification and applied breeding that need to be addressed to realize the full potential of MAS, and propose ideas and guidelines for establishing rice molecular breeding labs in the postgenome sequence era to integrate molecular breeding within the context of overall rice breeding and research programs.
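The selection step at the heart of MAS can be sketched as keeping only breeding lines that carry the favourable allele at every marker linked to a target QTL. The marker names, alleles and genotypes below are invented for illustration, not real rice marker data:

```python
# Toy marker-assisted selection: retain lines with the favourable allele
# at every tracked marker. All names and genotypes are hypothetical.

favourable = {"M1": "A", "M2": "T"}   # marker: favourable allele (assumed)

lines = {
    "line1": {"M1": "A", "M2": "T"},
    "line2": {"M1": "G", "M2": "T"},
    "line3": {"M1": "A", "M2": "C"},
}

selected = [name for name, geno in lines.items()
            if all(geno[m] == allele for m, allele in favourable.items())]
print(selected)  # → ['line1']
```

Real programs layer onto this the high-throughput genotyping platforms and bioinformatics databases discussed above, and weigh marker information against phenotypic selection.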
Abstract:
Objective: To improve the isolation rate and identification procedures for Haemophilus parasuis from pig tissues. Design: Thirteen sampling sites and up to three methods were used to confirm the presence of H. parasuis in pigs after experimental challenge. Procedure: Colostrum-deprived, naturally farrowed pigs were challenged intratracheally with H. parasuis serovar 12 or 4. Samples taken during necropsy were either inoculated onto culture plates, processed directly for PCR, or enriched prior to being processed for PCR. The recovery of H. parasuis from different sampling sites and using different sampling methods was compared for each serovar. Results: H. parasuis was recovered from several sample sites for all serovar 12 challenged pigs, while the trachea was the only positive site for all pigs following serovar 4 challenge. The method of solid-medium culture of swabs, with confirmation of the identity of cultured bacteria by PCR, resulted in 38% and 14% more positive results on a site basis for serovars 12 and 4, respectively, than direct PCR on the swabs. This difference was significant in the serovar 12 challenge. Conclusion: Conventional culture proved to be more effective in detecting H. parasuis than direct PCR or PCR on enrichment broths. For subacute (serovar 4) infections, the most successful sites for culture or direct PCR were pleural fluid, peritoneal fibrin and fluid, lung and pericardial fluid. For acute (serovar 12) infections, the best sites were lung, heart blood, affected joints and brain. The methodologies and key sampling sites identified in this study will enable improved isolation of H. parasuis and aid the diagnosis of Glässer's disease.
Abstract:
Knowledge of the drag force is an important design parameter in aerodynamics. Measurement of aerodynamic forces at hypersonic speeds is a challenge, and ground test facilities such as shock tunnels are usually used to carry out such tests. Accelerometer-based force balances are commonly employed for measuring aerodynamic drag around bodies in hypersonic shock tunnels. In this study, we present an analysis of the effect of model material on the performance of an accelerometer balance used for drag measurement in impulse facilities. From the experimental studies performed on models constructed of Bakelite HYLEM and aluminium, it is clear that the rigid-body assumption does not hold during the short testing duration available in shock tunnels. This is notwithstanding the fact that the rubber bush used for supporting the model allows unconstrained motion of the model during the short testing time available in the shock tunnel. The vibrations induced in the model by impact loading in the shock tunnel are damped out in the metallic model, resulting in a smooth acceleration signal, while the signal becomes noisy and non-linear for non-isotropic materials like Bakelite HYLEM. This also implies that careful analysis and proper data reduction methodologies are necessary when measuring aerodynamic drag on non-metallic models in shock tunnels. The results of drag measurements carried out using a 60° half-angle blunt cone are given in the present analysis.
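The basic data reduction for an accelerometer balance reduces to D = m·a for an unconstrained model, followed by normalisation to a drag coefficient. All numbers below are invented illustration values, not the study's measurements:

```python
import math

# Back-of-envelope accelerometer-balance data reduction. All inputs
# are assumed, illustrative values.
m = 0.35          # model mass, kg
a_axial = 120.0   # mean axial acceleration over the test time, m/s^2
rho = 0.02        # freestream density, kg/m^3
V = 2000.0        # freestream velocity, m/s
d = 0.05          # base diameter of the blunt cone, m

D = m * a_axial                 # drag force, N (rigid-body assumption)
q = 0.5 * rho * V**2            # freestream dynamic pressure, Pa
A = math.pi * (d / 2) ** 2      # reference (base) area, m^2
Cd = D / (q * A)                # drag coefficient

print(round(D, 1), round(Cd, 3))
```

The study's point is precisely that for non-metallic models the acceleration signal must be filtered or otherwise reduced before the first line of this calculation is trustworthy.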
Abstract:
Purpose: A retrospective planning study comparing volumetric modulated arc therapy (VMAT) and stereotactic body radiotherapy (SBRT) treatment plans for non-small cell lung cancer (NSCLC). Methods and materials: Five randomly selected early-stage lung cancer patients were included in the study. For each patient, four plans were created: the SBRT plan and three VMAT plans using different optimisation methodologies, giving a total of 20 plans for evaluation. The dose-conformity results and the target dose-constraint results were compared for these plans. Results: The mean planning target volume (PTV) for all plans (SBRT and VMAT) was 18·3 cm3, with a range from 15·6 to 20·1 cm3. The maximum dose to 1 cc in all plans was within 140% (84 Gy) of the prescribed dose, and 95% of the PTV in all plans received 100% of the prescribed dose (60 Gy). In all plans, 99% of the PTV received a dose >90% of the prescribed dose, and the mean dose ranged from 67 to 72 Gy. The planning target dose conformity for the SBRT and VMAT (0° and 15° collimator single-arc, and dual-arc) plans showed tight conformity of the prescription isodose to the target. Conclusions: SBRT and VMAT are radiotherapy approaches that increase doses to small tumour targets without increasing doses to the organs at risk. Although VMAT offers an alternative to SBRT for NSCLC, and its potential advantage is reduced treatment time, the statistical results show no significant difference between the SBRT and VMAT optimised plans in terms of dose conformity and organ-at-risk sparing.
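One common way to quantify the conformity compared above is an RTOG-style conformity index, the ratio of the volume enclosed by the prescription isodose to the planning target volume, with values near 1 indicating tight conformity. The isodose volume below is an assumed illustration value, not taken from the actual plans:

```python
# Simple RTOG-style conformity index: prescription isodose volume / PTV.
def conformity_index(v_prescription_isodose_cc, ptv_cc):
    return v_prescription_isodose_cc / ptv_cc

ptv = 18.3   # mean PTV reported in the study, cm^3
v60 = 21.5   # assumed volume enclosed by the 60 Gy isodose, cm^3

print(round(conformity_index(v60, ptv), 2))  # → 1.17
```

Other definitions exist (e.g. Paddick's index, which also penalises missing the target), so plan comparisons should state which index is used.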
Abstract:
Aerial surveys of kangaroos (Macropus spp.) in Queensland are used to make economically important judgements on the levels of viable commercial harvest. Previous analysis methods for aerial kangaroo surveys have used both mark-recapture methodologies and conventional distance-sampling analyses. Conventional distance sampling has the disadvantage that detection is assumed to be perfect on the transect line, while mark-recapture methods are notoriously sensitive to problems with unmodelled heterogeneity in capture probabilities. We introduce three methodologies for combining mark-recapture and distance-sampling data, aimed at exploiting the strengths of both approaches and overcoming their weaknesses. Of these methods, two are based on the assumption of full independence between observers in the mark-recapture component, and this appears to introduce more bias in density estimation than it resolves through allowing uncertain trackline detection. Both of these methods give lower density estimates than conventional distance sampling, indicating a clear failure of the independence assumption. The third method, termed point independence, appears to perform very well, giving credible density estimates and good properties in terms of goodness-of-fit and percentage coefficient of variation. Estimated densities of eastern grey kangaroos range from 21 to 36 individuals km⁻², with estimated coefficients of variation between 11% and 14% and estimated trackline detection probabilities primarily between 0.7 and 0.9.
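The distance-sampling side of the analysis can be illustrated with the classic half-normal detection function, its effective strip half-width, and the resulting density estimate. Conventional distance sampling assumes g(0) = 1 (perfect detection on the trackline), which is exactly the assumption the combined mark-recapture methods above relax. The values of sigma, n and L are invented, not the survey's estimates:

```python
import math

# Half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)),
# effective strip half-width mu = integral of g from 0 to infinity,
# and density D = n / (2 mu L). All inputs are assumed values.
sigma = 80.0       # detection-function scale, m
n = 450            # animals detected
L = 100_000.0      # total transect length, m

g = lambda x: math.exp(-x**2 / (2 * sigma**2))
mu = sigma * math.sqrt(math.pi / 2)   # closed form for the half-normal

density = n / (2 * mu * L)            # animals per m^2
per_km2 = density * 1_000_000

print(round(per_km2, 1))
```

In the mark-recapture variants, g(0) is estimated from double-observer data and the density is divided by it rather than assuming g(0) = 1.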
Abstract:
To remain competitive, many agricultural systems are now being run along business lines. Systems methodologies are being incorporated, and here evolutionary computation is a valuable tool for identifying more profitable or sustainable solutions. However, agricultural models typically pose some of the more challenging problems for optimisation. This chapter outlines these problems and then presents a series of three case studies demonstrating how they can be overcome in practice. First, increasingly complex models of Australian livestock enterprises show that evolutionary computation is the only viable optimisation method for these large and difficult problems; ongoing research is taking a notably efficient and robust variant, differential evolution, out into real-world systems. Next, models of cropping systems in Australia demonstrate the challenge of dealing with competing objectives, namely maximising farm profit whilst minimising resource degradation; Pareto methods are used to illustrate this trade-off, and the results have proved most useful for farm managers in this industry. Finally, land-use planning in the Netherlands demonstrates the size and spatial complexity of real-world problems; here, GIS-based optimisation techniques are integrated with Pareto methods, producing better solutions that were acceptable to the competing organisations. These three studies all show that evolutionary computation remains the only feasible method for the optimisation of large, complex agricultural problems. An extra benefit is that the resultant population of candidate solutions illustrates trade-offs, leading to more informed discussions and better education of the industry decision-makers.
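The differential evolution variant highlighted above can be sketched in its classic DE/rand/1/bin form. The parameter values (F, CR, population size) and the toy sphere objective are illustrative defaults, not the configuration used in the case studies:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    """Minimal DE/rand/1/bin minimiser over box constraints."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: base vector a perturbed by the difference of b and c.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [
                pop[a][k] + F * (pop[b][k] - pop[c][k])
                if (rng.random() < CR or k == j_rand) else pop[i][k]
                for k in range(dim)
            ]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft <= fit[i]:  # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5, 5), (-5, 5)])
print(f_best)
```

For the multi-objective cropping and land-use problems, this scalar selection step would be replaced by Pareto-dominance ranking so the population spreads along the profit–degradation trade-off.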