968 results for Stone Tool Function
Abstract:
This chapter provides an analysis of feedback from key stakeholders, collected as part of a research project, on the problems and tensions evident in the collective work practices of learning advisers employed in learning assistance services at an Australian metropolitan university (Peach, 2003). The term 'learning assistance' is generally used in the Australian higher education sector to refer to student support services that include assistance with academic writing and other study skills. The aim of the study was to help learning advisers and other key stakeholders develop a better understanding of the work activity, with a view to using this understanding to generate improvements in service provision. Over twenty problems and associated tensions were identified through stakeholder feedback; however, the focus of this chapter is the analysis of tensions related to a cluster of problems referred to as cost-efficiency versus quality service. Theoretical modelling derived from the tools made available through cultural historical activity theory and expansive visibilization (Engeström and Miettinen, 1999), together with excerpts from the data, is used to illustrate how different understandings of the purpose of learning assistance services impact on the work practices of learning advisers and create problems and tensions in relation to the type of service available (including use of technology), the level of service available, and learning adviser workload.
Abstract:
The Early Years Generalising Project involves Australian students in Years 1-4 (age 5-9) and explores how the students grasp and express generalisations. This paper focuses on data collected from clinical interviews with the Year 3 and 4 cohorts in an investigative study focusing on the identification, prediction and justification of function rules. It reports on students' attempts to generalise from function machine contexts, describing the various ways students express generalisation and highlighting the different levels of justification given by students. Finally, we conjecture that there is a set of stages in the expression and justification of generalisations that assists students to reach generality within tasks.
Abstract:
Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research has produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within the circuit, since this depends on the software they run. We have developed a static analysis tool that produces SIFA-compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study which shows how this new capability supports combined hardware and software dataflow analysis of a security-critical communications device.
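The dataflow-graph idea can be sketched in miniature. The snippet below is a hypothetical simplification: the variable names and the regex-based "parser" are illustrative only, and the actual tool analyses parsed C source to emit SIFA-compatible graphs. For each assignment it records edges from the variables read to the variable written:

```python
import re
from collections import defaultdict

def dataflow_edges(source: str):
    """Build a crude variable-level dataflow graph from C-like
    assignment statements: data flows from each identifier on the
    right-hand side to the assigned variable."""
    edges = defaultdict(set)
    for line in source.splitlines():
        m = re.match(r"\s*(\w+)\s*=\s*(.+);", line)
        if not m:
            continue
        target, expr = m.group(1), m.group(2)
        for src in re.findall(r"[A-Za-z_]\w*", expr):
            edges[src].add(target)  # data flows src -> target
    return {src: sorted(dsts) for src, dsts in edges.items()}

program = """
    key = read_port(0);
    buf = key + salt;
    out = buf;
"""
print(dataflow_edges(program))
# {'read_port': ['key'], 'key': ['buf'], 'salt': ['buf'], 'buf': ['out']}
```

Tracing reachability over such a graph (e.g. which outputs are reachable from `key`) is the kind of analysis performed over the combined hardware and software dataflow graphs.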
Abstract:
With the current curriculum focus on correlating classroom problem-solving lessons with real-world contexts, is LEGO robotics an effective problem-solving tool? The present study was designed to investigate this question and to ascertain what problem-solving strategies primary students engaged with when working with LEGO robotics, and whether the students were able to effectively relate their problem-solving strategies to real-world contexts. The qualitative study involved 23 Grade 6 students participating in robotics activities at a Brisbane primary school. The study included data collected from researcher observations of student problem-solving discussions, collected software programs, and data from a student-completed questionnaire. Results from the study indicated that the robotic activities assisted students to reflect on the problem-solving decisions they made. The study also highlighted that the students were able to relate their problem-solving strategies to real-world contexts. The study demonstrated that while LEGO robotics can be considered a useful problem-solving tool in the classroom, careful teacher scaffolding needs to be implemented with regard to correlating LEGO with authentic problem solving. Further research into how teachers can best embed real-world contexts into effective robotics lessons is recommended.
Abstract:
Increasingly, studies are reported that examine how conceptual modeling is conducted in practice. Yet, typically the studies to date have examined in isolation how modeling grammars can be, or are, used to develop models of information systems or organizational processes, without considering that such modeling is typically done by means of a modeling tool that extends the modeling functionality offered by a grammar through complementary features. This paper extends the literature by examining how the use of seven different features of modeling tools affects the usage beliefs users develop when using modeling grammars for process modeling. We show that five distinct tool features positively affect the usefulness, ease of use and satisfaction beliefs of users. We offer a number of interpretations of the findings. We also describe how the results inform decisions of relevance to developers of modeling tools as well as managers in charge of making modeling-related investment decisions.
Abstract:
Proteases regulate a spectrum of diverse physiological processes, and dysregulation of proteolytic activity drives a plethora of pathological conditions. Understanding protease function is essential to appreciating many aspects of normal physiology and progression of disease. Consequently, development of potent and specific inhibitors of proteolytic enzymes is vital to provide tools for the dissection of protease function in biological systems and for the treatment of diseases linked to aberrant proteolytic activity. The studies in this thesis describe the rational design of potent inhibitors of three proteases that are implicated in disease development. Additionally, key features of the interaction of proteases and their cognate inhibitors or substrates are analysed and a series of rational inhibitor design principles are expounded and tested. Rational design of protease inhibitors relies on a comprehensive understanding of protease structure and biochemistry. Analysis of known protease cleavage sites in proteins and peptides is a commonly used source of such information. However, model peptide substrate and protein sequences have widely differing levels of backbone constraint and hence can adopt highly divergent structures when binding to a protease’s active site. This may result in identical sequences in peptides and proteins having different conformations and diverse spatial distribution of amino acid functionalities. Regardless of this, protein and peptide cleavage sites are often regarded as being equivalent. One of the key findings in the following studies is a definitive demonstration of the lack of equivalence between these two classes of substrate and invalidation of the common practice of using the sequences of model peptide substrates to predict cleavage of proteins in vivo. Another important feature for protease substrate recognition is subsite cooperativity. 
This type of cooperativity is commonly referred to as protease or substrate binding subsite cooperativity and is distinct from allosteric cooperativity, where binding of a molecule distant from the protease active site affects the binding affinity of a substrate. Subsite cooperativity may be intramolecular where neighbouring residues in substrates are interacting, affecting the scissile bond’s susceptibility to protease cleavage. Subsite cooperativity can also be intermolecular where a particular residue’s contribution to binding affinity changes depending on the identity of neighbouring amino acids. Although numerous studies have identified subsite cooperativity effects, these findings are frequently ignored in investigations probing subsite selectivity by screening against diverse combinatorial libraries of peptides (positional scanning synthetic combinatorial library; PS-SCL). This strategy for determining cleavage specificity relies on the averaged rates of hydrolysis for an uncharacterised ensemble of peptide sequences, as opposed to the defined rate of hydrolysis of a known specific substrate. Further, since PS-SCL screens probe the preference of the various protease subsites independently, this method is inherently unable to detect subsite cooperativity. However, mean hydrolysis rates from PS-SCL screens are often interpreted as being comparable to those produced by single peptide cleavages. Before this study no large systematic evaluation had been made to determine the level of correlation between protease selectivity as predicted by screening against a library of combinatorial peptides and cleavage of individual peptides. This subject is specifically explored in the studies described here. In order to establish whether PS-SCL screens could accurately determine the substrate preferences of proteases, a systematic comparison of data from PS-SCLs with libraries containing individually synthesised peptides (sparse matrix library; SML) was carried out. 
These SML libraries were designed to include all possible sequence combinations of the residues that were suggested to be preferred by a protease using the PS-SCL method. SML screening against the three serine proteases kallikrein 4 (KLK4), kallikrein 14 (KLK14) and plasmin revealed highly preferred peptide substrates that could not have been deduced by PS-SCL screening alone. Comparing protease subsite preference profiles from screens of the two types of peptide libraries showed that the most preferred substrates were not detected by PS-SCL screening, as a consequence of intermolecular cooperativity being negated by the very nature of PS-SCL screening. Sequences that are highly favoured as a result of intermolecular cooperativity achieve optimal protease subsite occupancy, and thereby interact with very specific determinants of the protease. Identifying these substrate sequences is important since they may be used to produce potent and selective inhibitors of proteolytic enzymes. This study found that highly favoured substrate sequences that relied on intermolecular cooperativity allowed for the production of potent inhibitors of KLK4, KLK14 and plasmin. Peptide aldehydes based on preferred plasmin sequences produced high affinity transition state analogue inhibitors for this protease. The most potent of these maintained specificity over plasma kallikrein (known to have a very similar substrate preference to plasmin). Furthermore, the efficiency of this inhibitor in blocking fibrinolysis in vitro was comparable to aprotinin, which previously saw clinical use to reduce perioperative bleeding. One substrate sequence particularly favoured by KLK4 was substituted into the 14 amino acid, circular sunflower trypsin inhibitor (SFTI). This resulted in a highly potent and selective inhibitor (SFTI-FCQR) which attenuated protease activated receptor signalling by KLK4 in vitro. 
Moreover, SFTI-FCQR and paclitaxel synergistically reduced growth of ovarian cancer cells in vitro, making this inhibitor a lead compound for further therapeutic development. Similar incorporation of a preferred KLK14 amino acid sequence into the SFTI scaffold produced a potent inhibitor for this protease. However, the conformationally constrained SFTI backbone enforced a different intramolecular cooperativity, which masked a KLK14-specific determinant. As a consequence, the level of selectivity achievable was lower than that found for the KLK4 inhibitor. Standard mechanism inhibitors such as SFTI rely on a stable acyl-enzyme intermediate for high affinity binding. This is achieved by a conformationally constrained canonical binding loop that allows for reformation of the scissile peptide bond after cleavage. Amino acid substitutions within the inhibitor to target a particular protease may compromise structural determinants that support the rigidity of the binding loop and thereby prevent the engineered inhibitor reaching its full potential. An in silico analysis was carried out to examine the potential for further improvements to the potency and selectivity of the SFTI-based KLK4 and KLK14 inhibitors. Molecular dynamics simulations suggested that the substitutions within SFTI required to target KLK4 and KLK14 had compromised the intramolecular hydrogen bond network of the inhibitor and caused a concomitant loss of binding loop stability. Furthermore, in silico amino acid substitution revealed that SFTI variants which formed internal hydrogen bonds more frequently, and in greater number, consistently showed lower inhibition constants. These predictions allowed for the production of second generation inhibitors with enhanced binding affinity toward both targets, and highlight the importance of considering intramolecular cooperativity effects when engineering proteins or circular peptides to target proteases. 
The findings from this study show that although PS-SCLs are a useful tool for high-throughput screening of approximate protease preference, later refinement by SML screening is needed to reveal optimal subsite occupancy due to cooperativity in substrate recognition. This investigation has also demonstrated the importance of maintaining structural determinants of backbone constraint and conformation when engineering standard mechanism inhibitors for new targets. Combined, these results show that backbone conformation and amino acid cooperativity have more prominent roles than previously appreciated in determining substrate/inhibitor specificity and binding affinity. The three key inhibitors designed during this investigation are now being developed as lead compounds for cancer chemotherapy, control of fibrinolysis and cosmeceutical applications. These compounds form the basis of a portfolio of intellectual property which will be further developed in the coming years.
Abstract:
Buildings are among the most significant infrastructure in modern societies. The construction and operation of modern buildings consume a considerable amount of energy and materials and therefore contribute significantly to climate change. In order to reduce the environmental impact of buildings, various green building rating tools have been developed. In this paper, energy use in the building sector in Australia and around the world is first reviewed. This is then followed by discussion of the development and scope of various green building rating tools, with a particular focus on the Green Star rating scheme developed in Australia. It is shown that Green Star has significant implications for almost every aspect of the design of HVAC systems, including the selection of air handling and distribution systems, fluid handling systems, refrigeration systems, heat rejection systems and building control systems.
Abstract:
Introduction: This study reports on the development of a self-report assessment tool to increase the efficacy of crash prediction within Australian fleet settings. Over the last 20 years an array of measures has been produced (Driver Anger Scale, Driving Skill Inventory, Manchester Driver Behaviour Questionnaire, Driver Attitude Questionnaire, Driver Stress Inventory, Safety Climate Questionnaire). While these tools are useful, research has demonstrated limited ability to accurately identify the individuals most likely to be involved in a crash. Reasons cited include: crashes are relatively rare; other competing factors may influence a crash event; ongoing questions regarding the validity of self-report measures (common method variance etc.); and a lack of attention to contemporary issues relating to fleet driving performance.
Abstract:
In this article we identify how computational automation achieved through programming has enabled a new class of music technologies with generative music capabilities. These generative systems can have a degree of music making autonomy that impacts on our relationships with them; we suggest that this coincides with a shift in the music-equipment relationship from tool use to a partnership. This partnership relationship can occur when we use technologies that display qualities of agency. It raises questions about the kinds of skills and knowledge that are necessary to interact musically in such a partnership. These are qualities of musicianship we call eBility. In this paper we seek to define what eBility might consist of and how consideration of it might affect music education practice. The 'e' in eBility refers not only to the electronic nature of computing systems but also to the ethical, enabling, experiential and educational dimensions of the creative relationship with technologies with agency. We hope to initiate a discussion around differentiating what we term representational technologies from those with agency and begin to uncover the implications of these ideas for music educators in schools and communities. We hope also to elucidate the emergent theory and practice that has enabled the development of strategies for optimising this kind of eBility, where the tool becomes partner. The identification of musical technologies with agency adds to the authors' list of metaphors for technology use in music education that previously included tool, medium and instrument. We illustrate these ideas with examples and with data from our work with the jam2jam interactive music system. In this discussion we will outline our experiences with jam2jam as an example of a technology with agency and describe the aspects of eBility that interaction with it promotes.
Abstract:
This paper proposes a model-based technique for lowering the entrance barrier for service providers to register services with a marketplace broker, such that the service is rapidly configured to utilize the broker's local service delivery management components. Specifically, it uses process modeling for supporting the execution steps of a service and shows how service delivery functions (e.g. payment points) "local" to a service broker can be correctly configured into the process model. By formalizing the different operations in a service delivery function (like payment or settlement) and their allowable execution sequences (full payments must follow partial payments), including cross-function dependencies, it shows how, through tool support, the non-technical user can quickly configure service delivery functions in a consistent and complete way.
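The formalization of allowable execution sequences can be pictured as a small state machine over a function's operations. The sketch below is illustrative only (the operation names and transition table are assumptions, not the paper's actual formalization): a configured sequence is valid if every operation is permitted after its predecessor, e.g. settlement only after a full payment:

```python
# Allowed successor operations for a hypothetical payment function
# (illustrative transition table, not the paper's formalization).
ALLOWED_NEXT = {
    "start": {"partial_payment", "full_payment"},
    "partial_payment": {"partial_payment", "full_payment"},
    "full_payment": {"settlement"},
    "settlement": set(),  # terminal operation
}

def sequence_valid(ops):
    """Check an execution sequence against the allowed orderings."""
    state = "start"
    for op in ops:
        if op not in ALLOWED_NEXT[state]:
            return False
        state = op
    return True

print(sequence_valid(["partial_payment", "full_payment", "settlement"]))  # True
print(sequence_valid(["settlement"]))                                     # False
```

A configuration tool can use such a table to offer the non-technical user only the operations that remain consistent at each step.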
Abstract:
Prostate cancer is a significant health problem faced by aging men. Currently, diagnostic strategies for the detection of prostate cancer are either unreliable, yielding high numbers of false positive results, or too invasive to be used widely as screening tests. Furthermore, the current therapeutic strategies for the treatment of the disease carry considerable side effects. Although organ-confined prostate cancer can be curable, most detectable clinical symptoms occur in advanced disease, when primary tumour cells have metastasised to distant sites - usually lymph nodes and bone. Many growth factors and steroids assist the continued growth and maintenance of prostatic tumour cells. Of these mitogens, androgens are important in the development of the normal prostate but are also required to sustain the growth of prostate cancer cells in the early stage of the disease. Not only are androgens required in the early stage of disease, but many other growth factors and hormones also interact to cause uncontrolled proliferation of malignant cells. The early, androgen sensitive phase of disease is followed by an androgen insensitive phase, whereby androgens are no longer required to stimulate the growth of the tumour cells. Growth factors such as transforming growth factors α and β (TGFα/β), epidermal growth factor (EGF), basic fibroblast growth factor (bFGF), insulin-like growth factors (IGFs), Vitamin D and thyroid hormone have been suggested to be important at this stage of disease. Interestingly, some of the kallikrein family of genes, including prostate specific antigen (PSA), the current serum diagnostic marker for prostate cancer, are regulated by androgens and many of the aforementioned growth factors. The kallikrein gene family is a group of serine proteases that are involved in a diverse range of physiological processes: regulation of local blood flow, angiogenesis, tissue invasion and mitogenesis. 
The earliest members of the kallikrein gene family (KLK1-KLK3) have been strongly associated with general disease states, such as hypertension, inflammation, pancreatitis and renal disease, but are also linked to various cancers. Recently, this family was extended to include 15 genes (KLK1-15). Several newer members of the kallikrein family have been implicated in the carcinogenesis and tumour metastasis of hormone-dependent cancers such as prostate, breast, endometrial and ovarian cancer. The aims of this project were to investigate the expression of the newly identified kallikrein, KLK4, in benign and malignant prostate tissues, and in prostate cancer cell lines. This thesis has demonstrated the elevated expression of KLK4 mRNA transcripts in malignant prostate tissue compared to benign prostates. Additionally, expression of the full length KLK4 transcript was detected in the androgen dependent prostate cancer cell line, LNCaP. Based on the above finding, the LNCaP cell line was chosen to assess the potential regulation of full length KLK4 by androgen, thyroid hormone and epidermal growth factor. KLK4 mRNA and protein were found to be up-regulated by androgen and by a combination of androgen and thyroid hormone. Thyroid hormone alone produced no significant change in KLK4 mRNA or protein over the control. Epidermal growth factor treatment also resulted in elevated expression levels of KLK4 mRNA and protein. To assess the potential functional role(s) of KLK4/hK4 in processes associated with tumour progression, full length KLK4 was transfected into PC-3 cells - a prostate cancer cell line originally derived from a secondary bone lesion. The KLK4/hK4 over-expressing cells were assessed for their proliferation, migration, invasion and attachment properties. The KLK4 over-expressing clones exhibited a marked change in morphology, indicative of a more aggressive phenotype. The KLK4 clones were irregularly shaped with compromised adhesion to the growth surface. 
In contrast, the control cell lines (parent PC-3 and empty vector clones) retained a rounded morphology with obvious cell to cell adhesion, as well as significant adhesion to their growth surface. The KLK4 clones exhibited significantly greater attachment to Collagen I and IV than native PC-3s and empty vector controls. Over a 12 hour period, in comparison to the control cells, the KLK4 clones displayed an increase in migration towards PC-3 native conditioned media, a 3 fold increase towards conditioned media from an osteoblastic cell line (Saos-2) and no change in migration towards conditioned media from neonatal foreskin fibroblast cells or 20% foetal bovine serum. Furthermore, the increase in migration exhibited by the KLK4 clones was partially blocked by the serine protease inhibitor, aprotinin. The data presented in this thesis suggests that KLK4/hK4 is important in prostate carcinogenesis due to its over-expression in malignant prostate tissues, its regulation by hormones and growth factors associated with prostate disease and the functional consequences of over-expression of KLK4/hK4 in the PC-3 cell line. These results indicate that KLK4/hK4 may play an important role in tumour invasion and bone metastasis via increased attachment to the bone matrix protein, Collagen I, and enhanced migration due to soluble factors produced by osteoblast cells. This suggestion is further supported by the morphological changes displayed by the KLK4 over-expressing cells. Overall, this data suggests that KLK4/hK4 should be further studied to more fully investigate the potential value of KLK4/hK4 as a diagnostic/prognostic biomarker or in therapeutic applications.
Abstract:
As civil infrastructures such as bridges age, there is a concern for safety and a need for cost-effective and reliable monitoring tools. Different diagnostic techniques are available nowadays for structural health monitoring (SHM) of bridges. Acoustic emission is one such technique, with the potential to predict failure. The phenomenon of rapid release of energy within a material by crack initiation or growth, in the form of stress waves, is known as acoustic emission (AE). The AE technique involves recording the stress waves by means of sensors and subsequent analysis of the recorded signals, which then convey information about the nature of the source. AE can be used as a local SHM technique to monitor specific regions with a visible presence of cracks or crack-prone areas, such as welded regions and joints with bolted connections, or as a global technique to monitor the whole structure. The strength of the AE technique lies in its ability to detect active crack activity, thus helping to prioritise maintenance work by focusing on active rather than dormant cracks. In spite of being a promising tool, some challenges still exist in the successful application of the AE technique. One is the generation of a large amount of data during testing; hence effective data analysis and management are necessary, especially for long-term monitoring uses. Complications also arise because a number of spurious sources can give AE signals; therefore, different source discrimination strategies are necessary to distinguish genuine signals from spurious ones. Another major challenge is the quantification of the damage level by appropriate analysis of data. Intensity analysis using severity and historic indices, as well as b-value analysis, are some important methods and will be discussed and applied to the analysis of laboratory experimental data in this paper.
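As a rough illustration of the b-value analysis mentioned above: the AE b-value is the (negated) slope of the cumulative amplitude distribution under the Gutenberg-Richter relation log10 N(A) = a - b*(A/20), where A is peak amplitude in dB. The sketch below fits it by ordinary least squares on synthetic amplitudes; the thresholding scheme and data are assumptions for illustration, and practical AE work often uses maximum-likelihood estimates or the improved Ib-value:

```python
import math

def ae_b_value(amplitudes_db, bin_width=5.0):
    """Estimate the AE b-value by fitting log10 N(A) = a - b * (A / 20)
    to the cumulative amplitude distribution with ordinary least squares."""
    lo, hi = min(amplitudes_db), max(amplitudes_db)
    xs, ys = [], []
    threshold = lo
    while threshold <= hi:
        n = sum(1 for amp in amplitudes_db if amp >= threshold)  # cumulative count
        if n > 0:
            xs.append(threshold / 20.0)
            ys.append(math.log10(n))
        threshold += bin_width
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return -slope  # b is the negated slope of the fitted line

# Synthetic hit amplitudes following an exact Gutenberg-Richter law with b = 1
hits = [40] * 90 + [60] * 9 + [80]
print(round(ae_b_value(hits, bin_width=20.0), 3))  # 1.0
```

A falling b-value over successive monitoring windows is commonly read as a shift from distributed microcracking toward localised macrocrack growth.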
Abstract:
Since the availability of 3D full body scanners and the associated software systems for operations with large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by the representatives of the 3D paradigm need to be critically reviewed, though. 3D anthropometry has advantages as well as shortfalls, which need to be carefully considered. While it is apparent that the measurement of a full body point cloud allows for easier storage of raw data and improves quality control, the difficulties in calculating standardized measurements from the point cloud are widely underestimated. Early studies that made use of 3D point clouds to derive anthropometric dimensions have shown unacceptable deviations from the standardized results measured manually. While 3D human point clouds provide a valuable tool to replicate specific single persons for further virtual studies, or to personalize garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections. A 1D/2D approach is therefore sufficient to solve an ergonomic design problem. As a consequence, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach to design for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution function of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is to be seen as a two-dimensional integral or distribution function of length and girth, the calculation of combined percentiles – a common ergonomic requirement – is undefined. 
Consequently, we suggest critically reviewing the cost and use of 3D anthropometry. We also recommend making proper use of the widely available one- and two-dimensional anthropometric data in ergonomic design.
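The point about combined percentiles can be illustrated numerically: if a design accommodates the 5th-95th percentile range on each of two correlated body dimensions separately, the share of users accommodated on both dimensions at once falls below 90 percent. A minimal sketch with synthetic data (the means, spreads and the linear coupling are invented for illustration, not survey values):

```python
import random
import statistics

# Synthetic stature (mm) and sitting height (mm); values are illustrative.
random.seed(1)
stature = [random.gauss(1750, 70) for _ in range(10000)]
sitting = [0.52 * s + random.gauss(0, 20) for s in stature]

def pct(data, p):
    """p-th percentile via the inclusive quantile method."""
    return statistics.quantiles(data, n=100, method='inclusive')[p - 1]

s5, s95 = pct(stature, 5), pct(stature, 95)
h5, h95 = pct(sitting, 5), pct(sitting, 95)

# Fraction of people inside the 5th-95th band on BOTH variables:
# about 90% on each variable separately, but less jointly because
# the variables are correlated yet not perfectly so.
inside = sum(s5 <= s <= s95 and h5 <= h <= h95
             for s, h in zip(stature, sitting)) / len(stature)
print(round(inside, 3))
```

This is why designing "for the 5th to 95th percentile" on each dimension independently does not accommodate 90 percent of the population, and why combined percentiles have no single well-defined value.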
Abstract:
Design for Manufacturing (DFM) is an integral methodology in product development, starting from the concept development phase, with the aim of improving manufacturing productivity. It is used to reduce manufacturing costs in complex production environments while maintaining product quality. While Design for Assembly (DFA) focuses on the elimination of parts or their combination with other components, which in most cases amounts to performing a function and a manufacturing operation in a simpler way, DFM follows a more holistic approach. Common considerations for DFM are standard components, manufacturing tool inventory and capability, material compatibility with the production process, part handling, logistics, tool wear and process optimization, quality control complexity and Poka-Yoke design. During DFM, the considerable background work required in the conceptual phase is compensated for by a shortening of later development phases. Current DFM projects normally apply an iterative step-by-step approach and eventually transfer to the developer team. This study introduces a new, knowledge-based approach to DFM, eliminating steps of DFM and showing the implications for the work process. Furthermore, a concurrent engineering process via a transparent interface between the manufacturing engineering and product development systems is proposed.