891 results for Design methods
Abstract:
Aim: The requirement for an allied health workforce is expanding as the global burden of disease increases internationally. To safely meet the demand for an expanded workforce of orthotist/prosthetists in Australia, competency-based standards that are up to date and evidence-based are required. The aims of this study were to determine the minimum level for entry into the orthotic/prosthetic profession; to develop entry-level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally, using an evidence-based approach. Methods: A mixed-methods research design was applied, using a three-step sequential exploratory design, where step 1 involved collecting and analyzing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, producing the draft competency standards; and step 3 involved quantitative data collection and analysis via a Delphi survey. In stage 1 (steps 1 and 2), the two focus groups – an expert group and a recent-graduate group of Australian orthotist/prosthetists – were led by an experienced facilitator to identify gaps in the current competency standards and then to outline a key purpose, work roles, and tasks for the profession. The resulting domains and activities of the first draft of the competency standards were synthesized using thematic analysis. In stage 2 (step 3), the draft competency standards were circulated to a purposive sample of the membership of the Australian Orthotic Prosthetic Association over three rounds of a Delphi survey. A project reference group of orthotist/prosthetists reviewed the results of both stages. Results: In stage 1, the expert (n = 10) and the new-graduate (n = 8) groups separately identified work roles and tasks, which formed the initial draft of the competency standards. Further drafts were refined and performance criteria added by the project reference group, resulting in the final draft competency standards. In stage 2, the final draft competency standards were circulated to 56 members (n = 44 in the final round) of the Association, who agreed on the key purpose, 6 domains, 18 activities, and 68 performance criteria of the final competency standards. Conclusion: This study outlines a rigorous, evidence-based mixed-methods approach for developing and endorsing professional competency standards that is representative of the views of the orthotist/prosthetist profession.
Abstract:
NMR spectroscopy enables the study of biomolecules from peptides and carbohydrates to proteins at atomic resolution. The technique uniquely allows for structure determination of molecules in the solution state. It also gives insights into dynamics and intermolecular interactions important for determining biological function. Detailed molecular information is entangled in the nuclear spin states. The information can be extracted by pulse sequences designed to measure the desired molecular parameters. Advancement of pulse sequence methodology therefore plays a key role in the development of biomolecular NMR spectroscopy. A range of novel pulse sequences for solution-state NMR spectroscopy are presented in this thesis. The pulse sequences are described in relation to the molecular information they provide. The pulse sequence experiments represent several advances in NMR spectroscopy, with particular emphasis on applications to proteins. Some of the novel methods focus on methyl-containing amino acids, which are pivotal for structure determination. Methyl-specific assignment schemes are introduced for increasing the size range of 13C,15N-labeled proteins amenable to structure determination without resorting to more elaborate labeling schemes. Furthermore, cost-effective means are presented for monitoring amide and methyl correlations simultaneously. Residual dipolar couplings can be applied for structure refinement as well as for studying dynamics. Accurate methods for measuring residual dipolar couplings in small proteins are devised, along with special techniques applicable when proteins require high-pH or high-temperature solvent conditions. Finally, a new technique is demonstrated to diminish strong-coupling-induced artifacts in HMBC, a routine experiment for establishing long-range correlations in unlabeled molecules. The presented experiments facilitate structural studies of biomolecules by NMR spectroscopy.
Abstract:
With Safe Design and Construction of Machinery, the author presents the results of empirical studies into this significant aspect of safety science in a very readable, well-structured format. The book contains 436 references, 17 tables, one figure and a comprehensive index. Liz Bluff addresses a complex and important, but often neglected, domain in OHS – the safety of machinery – in a holistic, profound, yet evidence-based analysis, with many applied cases from her studies, which make the book accessible and a pleasant read. Although the research that led to this remarkable publication might have been primarily focused on regulators, this book can be highly recommended to all OHS academics and practitioners. It provides an important contribution to the body of knowledge in OHS, and establishes one of the few Australian in-depth insights into the significance of machinery producers, rather than machinery users, in the wider framework of risk management. The author bases this fresh perspective on the well-established European Machinery Safety guidelines, and grounds her mixed-methods research predominantly in qualitative analysis of motivation and knowledge, which eventually leads to specific safety outcomes. It should be noted that both European and Australian legal aspects are investigated and considered, as both equally apply to many machinery exporters. A detailed description of the research design and methods can be found in an appendix. Overall, the unique combination of quantitative safety performance data and qualitative analysis of safety behaviours forms a valuable addition to our understanding of machinery safety. The author must be congratulated on making these complex relationships transparent to the reader through her meticulous inquiry.
Abstract:
Microarrays are high-throughput biological assays that allow the screening of thousands of genes for their expression. The main idea behind microarrays is to compute for each gene a unique signal that is directly proportional to the quantity of mRNA that was hybridized on the chip. The large number of steps involved, and the errors associated with each step, make the generated expression signal noisy. As a result, microarray data need to be carefully pre-processed before their analysis can be assumed to lead to reliable and biologically relevant conclusions. This thesis focuses on developing methods for improving the gene signal and further utilizing this improved signal for higher-level analysis. To achieve this, first, approaches for designing microarray experiments using various optimality criteria, considering both biological and technical replicates, are described. A carefully designed experiment leads to a signal with low noise, as the effect of unwanted variation is minimized and the precision of the estimates of the parameters of interest is maximized. Second, a system for improving the gene signal by using three scans at varying scanner sensitivities is developed. A novel Bayesian latent intensity model is then applied to these three sets of expression values, corresponding to the three scans, to estimate the suitably calibrated true signal of the genes. Third, a novel image segmentation approach that segregates the fluorescent signal from the undesired noise is developed using an additional dye, SYBR Green RNA II. This technique helps to identify signal arising only from the hybridized DNA, while signal corresponding to dust, scratches, spilled dye, and other noise is avoided. Fourth, an integrated statistical model is developed, where signal correction, systematic array effects, dye effects, and differential expression are modelled jointly, as opposed to a sequential application of several methods of analysis. The methods described here have been tested only for cDNA microarrays but can also, with some modifications, be applied to other high-throughput technologies. Keywords: high-throughput technology, microarray, cDNA, multiple scans, Bayesian hierarchical models, image analysis, experimental design, MCMC, WinBUGS.
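To make the multi-scan idea concrete, the toy sketch below (Python, with hypothetical gain and noise values) simulates three scans of the same array at different scanner sensitivities and recovers an estimate of the latent signal by gain-correcting and averaging the non-saturated readings. This naive average only stands in for the Bayesian latent intensity model used in the thesis; it is shown purely to illustrate why multiple scans extend the usable dynamic range.

```python
import numpy as np

# Illustrative sketch only: three scans of one array at different gains,
# combined by gain-corrected averaging of non-saturated readings.
rng = np.random.default_rng(0)
SATURATION = 65535                           # 16-bit scanner ceiling
GAINS = np.array([1.0, 4.0, 16.0])           # hypothetical low/medium/high sensitivity

# Latent "true" intensities, kept below the ceiling of the lowest-gain scan.
true_signal = np.minimum(rng.lognormal(mean=8.0, sigma=2.0, size=1000), 60000.0)

# Observed values: gain * signal + additive noise, clipped at the ceiling.
obs = np.clip(GAINS[:, None] * true_signal + rng.normal(0.0, 50.0, (3, 1000)),
              0.0, SATURATION)

# Naive calibration: rescale each scan by its gain and ignore saturated readings.
usable = obs < SATURATION
calibrated = np.where(usable, obs / GAINS[:, None], np.nan)
estimate = np.nanmean(calibrated, axis=0)    # the gain-1 scan is always usable here

print("median relative error:",
      np.median(np.abs(estimate - true_signal) / true_signal))
```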
Abstract:
Introduction Electronic medication administration record (eMAR) systems are promoted as a potential intervention to enhance medication safety in residential aged care facilities (RACFs). The purpose of this study was to conduct an in-practice evaluation of an eMAR being piloted in one Australian RACF before its roll-out, and to provide recommendations for system improvements. Methods A multidisciplinary team conducted direct observations of workflow (n=34 hours) at the RACF site and the community pharmacy. Semi-structured interviews (n=5) with RACF staff and the community pharmacist were conducted to investigate their views of the eMAR system. Data were analysed using a grounded theory approach to identify challenges associated with the design of the eMAR system. Results The current eMAR system does not offer an end-to-end solution for medication management. Many steps, including prescribing by doctors and communication with the community pharmacist, are still performed manually using paper charts and fax machines. Five major challenges associated with the design of the eMAR system were identified: limited interactivity; inadequate flexibility; problems related to information layout and semantics; the lack of relevant decision support; and system maintenance issues. We suggest recommendations to improve the design of the eMAR system and to optimise existing workflows. Discussion Immediate value can be achieved by improving the system's interactivity, reducing inconsistencies in data entry design and offering dedicated organisational support to minimise connectivity issues. Longer-term benefits can be achieved by adding decision support features and establishing system interoperability requirements with stakeholder groups (e.g. community pharmacies) prior to system roll-out. In-practice evaluations of technologies like the eMAR system have great value in identifying design weaknesses that inhibit optimal system use.
Abstract:
Ubiquitous computing is about making computers and computerized artefacts a pervasive part of our everyday lives, bringing more and more activities into the realm of information. The computationalization and informationalization of everyday activities increases not only our reach, efficiency and capabilities but also the amount and kinds of data gathered about us and our activities. In this thesis, I explore how information systems can be constructed so that they handle this personal data in a reasonable manner. The thesis provides two kinds of results: on the one hand, tools and methods for both the construction and the evaluation of ubiquitous and mobile systems; on the other hand, an evaluation of the privacy aspects of a ubiquitous social awareness system. The work emphasises real-world experiments as the most important way to study privacy. Additionally, the state of current information systems as regards data protection is studied. The tools and methods in this thesis consist of three distinct contributions. An algorithm for locationing in cellular networks is proposed that does not require the location information to be revealed beyond the user's terminal. A prototyping platform for the creation of context-aware ubiquitous applications, called ContextPhone, is described and released as open source. Finally, a set of methodological findings for the use of smartphones in social scientific field research is reported. A central contribution of this thesis is the set of pragmatic tools that allow other researchers to carry out experiments. The evaluation of the ubiquitous social awareness application ContextContacts covers both the usage of the system in general and an analysis of its privacy implications. Drawing on several long-term field studies, the usage of the system is analyzed in the light of how users make inferences about others based on real-time contextual cues mediated by the system. The analysis of privacy implications draws together the social-psychological theory of self-presentation and research on privacy for ubiquitous computing, deriving a set of design guidelines for such systems. The main findings from these studies can be summarized as follows. The fact that ubiquitous computing systems gather more data about users can be used not only to study the use of such systems in an effort to create better systems but also, more generally, to study phenomena previously unstudied, such as the dynamic change of social networks. Systems that let people create new ways of presenting themselves to others can be fun for their users, but self-presentation requires several thoughtful design decisions that allow the manipulation of the image mediated by the system. Finally, the growing amount of computational resources available to users can be used to allow them to use the data themselves, rather than just being passive subjects of data gathering.
Abstract:
Background: Palliative medicine and other specialists play significant legal roles in decisions to withhold and withdraw life-sustaining treatment at the end of life. Yet little is known about their knowledge of, or attitudes to, the law, and the role they think it should play in medical practice. Consideration of doctors' views is critical to optimizing patient outcomes at the end of life. However, doctors are difficult to engage as participants in empirical research, presenting challenges for researchers seeking to understand doctors' experiences and perspectives. Aims: To determine how to engage doctors involved in end-of-life care in empirical research about knowledge of the law and the role it plays in medical practice at the end of life. Methods: Postal survey of all specialists in palliative medicine, emergency medicine, geriatric medicine, intensive care, medical oncology, renal medicine, and respiratory medicine in three Australian states: New South Wales, Victoria, and Queensland. The survey was sent in hard copy with two reminders, and a follow-up reminder letter was also sent to the directors of hospital emergency departments. Awareness was further promoted through engagement with the relevant medical colleges and publications in professional journals; various incentives to respond were also used. The key measure is the response rate of doctors to the survey. Results: Thirty-two percent of doctors in the main study completed their survey, with response rates by specialty ranging from 52% (palliative care) to 24% (medical oncology). This overall response rate was twice that of the reweighted pilot study (16%). Conclusions: Doctors remain a difficult cohort to engage in survey research, but strategic recruitment efforts can be effective in increasing response rates. Collaboration with doctors and their professional bodies in both the development of the survey instrument and the recruitment of participants is essential.
Abstract:
In this paper we consider HCI's role in technology interventions for health and well-being. Three projects carried out by the authors are analysed by appropriating the idea of a value chain to chart a causal history from proximal effects generated in early episodes of design through to distal health and well-being outcomes. Responding to recent arguments that favour bounding HCI's contribution to local patterns of use, we propose an unbounded view of HCI that addresses an extended value chain of influence. We discuss a view of HCI methods as mobilising this value chain perspective in multi-disciplinary collaborations through its emphasis on early prototyping and naturalistic studies of use.
Abstract:
Composting refers to the aerobic degradation of organic material and is one of the main waste treatment methods used in Finland for treating separated organic waste. The composting process converts organic waste into a humus-like end product which can be used to increase the organic matter in agricultural soils, in gardening, or in landscaping. Microbes play a key role as degraders during the composting process, and the microbiology of composting has been studied for decades, but there are still open questions regarding the microbiota in industrial composting processes. It is known that with traditional, culturing-based methods only a small fraction, below 1%, of the species in a sample is normally detected. In recent years an immense diversity of bacteria, fungi and archaea has been found to occupy many different environments. Therefore the methods for characterising microbes constantly need to be developed further. In this thesis the presence of fungi and bacteria in full-scale and pilot-scale composting processes was characterised with cloning and sequencing. Several clone libraries were constructed and altogether nearly 6000 clones were sequenced. The microbial communities detected in this study were found to differ from the compost microbes observed in previous research with cultivation-based methods or with molecular methods applied to smaller-scale processes, although there were similarities as well. The bacterial diversity was high. Based on non-parametric coverage estimations, the number of bacterial operational taxonomic units (OTUs) in certain stages of composting was over 500. Sequences similar to Lactobacillus and Acetobacteria were frequently detected in the early stages of drum composting. In the tunnel stages of composting the bacterial community comprised Bacillus, Thermoactinomyces, Actinobacteria and Lactobacillus. The fungal diversity was also high, and phylotypes similar to yeasts were abundant in the full-scale drum and tunnel processes. In addition to phylotypes similar to Candida, Pichia and Geotrichum, moulds from the genera Thermomyces and Penicillium were observed in the tunnel stages of composting. Zygomycetes were detected in the pilot-scale composting processes and in the compost piles. In some of the samples a few abundant phylotypes present in the clone libraries masked the rare ones. The rare phylotypes were of interest, and a method for collecting them from clone libraries for sequencing was developed. With negative selection of the abundant phylotypes, the rare ones were picked from the clone libraries. In this way 41% of the clones in the studied clone libraries were sequenced. Since microbes play a central role in composting and in many other biotechnological processes, rapid methods for characterization of microbial diversity would be of value, both scientifically and commercially. Current methods, however, lack sensitivity and specificity and are therefore under development. Microarrays have been used in microbial ecology for a decade to study the presence or absence of certain microbes of interest in a multiplex manner. The sequence database collected in this thesis was used as the basis for probe design and microarray development. An enzyme-assisted detection method, the ligation detection reaction (LDR) based microarray, was adapted for species-level detection of microbes characteristic of each stage of the composting process.
With the use of a specially designed control probe it was established that a species-specific probe can detect target DNA representing as little as 0.04% of the total DNA in a sample. The developed microarray can be used to monitor composting processes or the hygienisation of the compost end product. A large compost microbe sequence dataset was collected and analysed in this thesis. The results provide valuable information on microbial community composition during industrial-scale composting processes. The microarray method was developed on the basis of the sequence database collected in this study. The method can be utilised to follow the fate of microbes of interest during the composting process in an extremely sensitive and specific manner. The platform for the microarray is universal and the method can easily be adapted for studying microbes from environments other than compost.
Abstract:
Nature has used the all-alpha-polypeptide backbone of proteins to create a remarkable diversity of folded structures. Sequential patterns of 20 distinct amino acids, which differ only in their side chains, determine the shape and form of proteins. Our understanding of these specific secondary structures is over half a century old and is based primarily on the fundamental elements: the Pauling alpha-helix and beta-sheet. Researchers can also generate structural diversity through the synthesis of polypeptide chains containing homologated (omega) amino acid residues, which contain a variable number of backbone atoms. However, incorporating amino acids with more atoms within the backbone introduces additional torsional freedom into the structure, which can complicate the structural analysis. Fortunately, gabapentin (Gpn), a readily available bulk drug, is an achiral beta,beta-disubstituted gamma amino acid residue that contains a cyclohexyl ring at the C-beta carbon atom, which dramatically limits the range of torsion angles that can be obtained about the flanking C-C bonds. Limiting conformational flexibility also has the desirable effect of increasing peptide crystallinity, which permits unambiguous structural characterization by X-ray diffraction methods. This Account describes studies carried out in our laboratory that establish Gpn as a valuable residue in the design of specifically folded hybrid peptide structures. The insertion of additional atoms into polypeptide backbones facilitates the formation of intramolecular hydrogen bonds whose directionality is opposite to that observed in canonical alpha-peptide helices. If hybrid structures mimic proteins and biologically active peptides, the proteolytic stability conferred by unusual backbones can be a major advantage in the area of medicinal chemistry. We have demonstrated a variety of internally hydrogen-bonded structures in the solid state for Gpn-containing peptides, including the characterization of the C-7 and C-9 hydrogen bonds, which can lead to ribbons in homo-oligomeric sequences. In hybrid alpha-gamma sequences, distinct C-12 hydrogen-bonded turn structures support the formation of peptide helices and hairpins in longer sequences. Some peptides that include the Gpn residue have hydrogen-bond directionality that matches alpha-peptide helices, while others have the opposite directionality. We expect that expansion of the polypeptide backbone will lead to new classes of foldamer structures, which are thus far unknown in the world of alpha-polypeptides. The diversity of internally hydrogen-bonded structures observed in hybrid sequences containing Gpn shows promise for the rational design of novel peptide structures incorporating hybrid backbones.
Abstract:
Dispersing a data object into a set of data shares is a fundamental operation in distributed communication and storage systems. In comparison to data replication, data dispersal with redundancy saves space and bandwidth. Moreover, dispersing a data object across distinct communication links or storage sites limits adversarial access to the whole data and tolerates the loss of some data shares. Existing data dispersal schemes have mostly been based on various mathematical transformations of the data, which incur high computation overhead. This paper presents a novel data dispersal scheme where each part of a data object is replicated, without encoding, into a subset of data shares according to combinatorial design theory. In particular, data parts are mapped to points and data shares are mapped to lines of a projective plane. Data parts are then distributed to data shares using the point–line incidence relations of the plane, so that certain subsets of data shares collectively possess all data parts. The presented scheme combines combinatorial design theory with an inseparability transformation to achieve secure data dispersal at reduced computation, communication and storage costs. Rigorous formal analysis and experimental study demonstrate significant cost benefits of the presented scheme in comparison to existing methods.
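As a concrete illustration of point–line incidence dispersal, the sketch below (Python, with hypothetical helper names) uses the smallest projective plane, the 7-point Fano plane: each data part is a point, each share is a line, and a part is replicated into exactly those shares whose lines pass through it. The paper's actual scheme additionally applies an inseparability transformation and may use larger planes; neither is shown here.

```python
# Illustrative incidence-based dispersal on the Fano plane (order-2 projective plane).
FANO_LINES = [
    (0, 1, 2), (0, 3, 4), (0, 5, 6),   # three lines concurrent at point 0
    (1, 3, 5), (1, 4, 6),
    (2, 3, 6), (2, 4, 5),
]

def disperse(parts):
    """Replicate each data part (a 'point') into every share (a 'line') incident to it.
    With 7 parts this yields 7 shares of 3 parts each."""
    assert len(parts) == 7, "this toy example uses the 7-point Fano plane"
    return [{p: parts[p] for p in line} for line in FANO_LINES]

def reconstruct(shares_subset):
    """Merge a subset of shares; succeeds iff their lines cover all 7 points."""
    merged = {}
    for share in shares_subset:
        merged.update(share)
    if len(merged) < 7:
        raise ValueError("selected shares do not cover every data part")
    return [merged[p] for p in range(7)]

if __name__ == "__main__":
    parts = [f"chunk-{i}".encode() for i in range(7)]
    shares = disperse(parts)
    # Three shares whose lines meet in a common point cover all 7 parts:
    print(reconstruct([shares[0], shares[1], shares[2]]) == parts)   # True
    # A single share reveals only 3 of the 7 parts:
    print(len(shares[0]))                                            # 3
```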
Abstract:
Volumetric-method-based adsorption measurements of nitrogen on two specimens of activated carbon (Fluka and Sarabhai) reported by us earlier are refitted to two popular isotherms, namely the Dubinin–Astakhov (D–A) and Toth isotherms, in light of improved fitting methods derived recently. These isotherms have been used to derive other data relevant to the design of engineering equipment, such as the concentration dependence of the heat of adsorption and the Henry's law coefficients. The present fits provide a better representation of the experimental measurements than before because the temperature dependence of the adsorbed-phase volume and the structural heterogeneity of the micropore distribution have been accounted for in the D–A equation. A new correlation for the Toth equation is a further contribution. The heat of adsorption in the limiting-uptake condition is correlated with the Henry's law coefficients at the near-zero-uptake condition.
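For reference, the canonical textbook forms of the two isotherms are given below; these are not the modified forms fitted in this work, which additionally account for the temperature dependence of the adsorbed-phase volume and micropore heterogeneity. Here W is the adsorbed volume, W0 the limiting micropore volume, E the characteristic energy, n the heterogeneity exponent, A the adsorption potential, Ps the saturation pressure, q the uptake, qm the limiting uptake, b the affinity coefficient and t the Toth heterogeneity parameter.

```latex
% Canonical Dubinin--Astakhov and Toth isotherms (standard forms, for reference)
\begin{align}
  W &= W_0 \exp\!\left[-\left(\frac{A}{E}\right)^{n}\right],
  \qquad A = R T \ln\!\frac{P_s}{P}
  && \text{(Dubinin--Astakhov)} \\
  q &= \frac{q_m\, b\, P}{\left[\,1 + (b P)^{t}\,\right]^{1/t}}
  && \text{(Toth)}
\end{align}
```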
Abstract:
An adaptive drug delivery design is presented in this paper using neural networks for effective treatment of infectious diseases. The generic mathematical model used describes the coupled evolution of the concentrations of pathogens, plasma cells and antibodies, together with a numerical value that indicates the relative condition of an organ damaged by the disease, under the influence of external drugs. From a system-theoretic point of view, the external drugs can be interpreted as control inputs, which can be designed based on control-theoretic concepts. In this study, assuming a set of nominal parameters in the mathematical model, a nonlinear controller (drug administration plan) is first designed based on the principle of dynamic inversion. This nominal drug administration plan was found to be effective in curing "nominal model patients" (patients whose immunological dynamics conform exactly to the mathematical model used for the control design). However, it was found to be ineffective in general for curing "realistic model patients" (patients whose immunological dynamics may have off-nominal parameter values and possibly unwanted inputs). Hence, to make the drug dosage design more effective for realistic model patients, a model-following adaptive control design is carried out next with the help of neural networks that are trained online. Simulation studies indicate that the adaptive controller proposed in this paper holds promise for killing the invading pathogens and healing the damaged organ even in the presence of parameter uncertainties and continued pathogen attack. Note that the computational requirements for computing the control are minimal and all associated computations (including the training of the neural networks) can be carried out online. The approach does assume, however, that the required diagnosis process can be carried out at a sufficiently fast rate so that all the states are available for control computation.
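To show the dynamic-inversion step in isolation, the sketch below (Python) applies it to a hypothetical one-state control-affine model of pathogen concentration. The paper's actual model has several coupled immunological states and an additional online-trained neural network that compensates for off-nominal parameters; neither is reproduced here.

```python
import numpy as np

# Minimal dynamic-inversion sketch for a scalar control-affine system
#   dx/dt = f(x) + g(x) * u
# A hypothetical one-state "pathogen concentration" model stands in for the
# four-state immunological model of the paper.

def f(x):          # assumed open-loop pathogen growth (illustrative only)
    return 0.5 * x

def g(x):          # assumed drug efficacy term (illustrative only)
    return -1.0

def dynamic_inversion_control(x, x_ref, k=2.0):
    """Choose u so the closed loop obeys d(x - x_ref)/dt = -k (x - x_ref)."""
    return (-k * (x - x_ref) - f(x)) / g(x)

# Forward-Euler simulation of the nominal closed loop.
dt, x, x_ref = 0.01, 5.0, 0.0                            # drive pathogen level to zero
for step in range(1000):
    u = max(dynamic_inversion_control(x, x_ref), 0.0)    # drug dose cannot be negative
    x += dt * (f(x) + g(x) * u)

print(f"pathogen level after 10 s: {x:.6f}")
```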
Abstract:
Purpose The research purpose was to identify both the inspiration sources used by fast fashion designers and the ways in which designers sort information from those sources during the product development process. Design/methodology/approach This is a qualitative study, drawing on semi-structured interviews conducted with the members of the in-house design teams of three Australian fast fashion companies. Findings Australian fast fashion designers rely on a combination of trend data, sales data, product analysis and travel for design development ideas. The designers then use the consensus and embodiment methods to interpret and synthesise information from those inspiration sources. Research limitations/implications The empirical data used in the analysis were limited to interviews with fashion designers at only three Australian companies. Originality/value This research augments knowledge of fast fashion product development, in particular designers' methods and approaches to product design within a volatile and competitive market.
Abstract:
Product success is substantially influenced by how well the knowledge needs of designers are satisfied, and many tools and methods have been proposed to support these needs. However, adoption of these methods in industry is minimal. This may be due to an inadequate understanding of the knowledge needs of designers in industry. This research attempts to develop a better understanding of these needs by undertaking descriptive studies in an industrial setting. We propose a taxonomy of knowledge and evaluate it by analyzing the questions asked by the designers involved in the study during their interactions. Using the taxonomy, we converted the questions asked into a generic form. The generic questions provide an understanding of what knowledge must be captured during design and what its structure should be.