238 results for functionality


Relevance: 10.00%

Publisher:

Abstract:

The management of risks in business processes has been a subject of active research in the past few years. Many benefits can potentially be obtained by integrating the two traditionally separated fields of risk management and business process management, including the ability to minimize risks in business processes (by design) and to mitigate risks at run time. An increasing number of research efforts aimed at delivering such an integrated system have recently been proposed. However, these efforts vary in terms of their scope, goals, and functionality. Through systematic collection and evaluation of the relevant literature, this paper compares and classifies current approaches to risk-aware business process management in order to identify and explain relevant research gaps. The process through which the literature was collected, filtered, and evaluated is also detailed.

Relevance: 10.00%

Publisher:

Abstract:

‘Wearable technology’, or the use of specialist technology in garments, is promoted by the electronics industry as the next frontier of fashion. However, the story of wearable technology’s relationship with fashion begins neither with the development of miniaturised computers in the 1970s nor with the sophisticated ‘smart textiles’ of the twenty-first century, despite what much of the rhetoric suggests. This study examines wearable technology against a longer history of fashion, highlighted by the influential techno-sartorial experiments of a group of early twentieth-century avant-garde artists, including the Italian Futurists Giacomo Balla and F.T. Marinetti, the Russian Constructivists Varvara Stepanova and Liubov Popova, and the Paris-based Cubist Sonia Delaunay. Through the interdisciplinary framework of fashion studies, the thesis provides a fuller picture of wearable technology framed by the idea of utopia. Using comparative analysis, and applying the theoretical formulations of Fredric Jameson, Louis Marin and Michael Carter, the thesis traces the appearance of three techno-utopian themes from their origins in the machine-age experiments of Balla, Marinetti, Stepanova, Popova and Delaunay to their twenty-first-century reappearance in a dozen wearable technology projects. Exploring the central thesis that contemporary wearable technology resurrects the techno-utopian ideas and expressions of the early twentieth century, the study concludes that the abiding utopian impetus behind embedding technology in the aesthetics (prints, silhouettes, and fabrication) and functionality of fashion is to unify subject, society and environment under a totalising technological order.

Relevance: 10.00%

Publisher:

Abstract:

Building Web 2.0 sites does not necessarily ensure a site’s success. We aim to better understand what improves the success of a site by drawing insight from biologically inspired design patterns. Web 2.0 sites provide a mechanism for human interaction, enabling powerful intercommunication between massive volumes of users. Early Web 2.0 site providers that were previously dominant are being succeeded by newer sites providing innovative social interaction mechanisms. Understanding which site traits contribute to this success drives research into Web site mechanics, using models to describe the associated social networking behaviour. Some of these models attempt to show how the volume of users provides self-organisation and self-contextualisation of content. One model describing coordinated environments is stigmergy, a term originally describing coordinated insect behaviour. This paper explores how exploiting stigmergy can provide a valuable mechanism for identifying and analysing online user behaviour, particularly given that user freedom of choice is restricted by the functionality the Web site provides. This will aid us in building better collaborative Web sites and improving the collaborative processes they support.
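
As a rough illustration of the stigmergic mechanism described above, the sketch below ranks content by letting user interactions deposit a decaying ‘pheromone’ trace on items. The loop structure, names and constants are illustrative assumptions, not a model taken from the paper.

```python
# A rough sketch of stigmergic content ranking: interactions deposit a
# 'pheromone' trace on content, traces evaporate over time, and ranking
# emerges from the accumulated traces. All names/constants are illustrative.
from collections import defaultdict

EVAPORATION = 0.9   # fraction of trace retained each cycle (assumed)
DEPOSIT = 1.0       # trace left by one user interaction (assumed)

def rank_content(interaction_log):
    """interaction_log: per-cycle lists of content ids that users touched."""
    pheromone = defaultdict(float)
    for interactions in interaction_log:
        for item in pheromone:          # stale content fades...
            pheromone[item] *= EVAPORATION
        for item in interactions:       # ...while activity reinforces items
            pheromone[item] += DEPOSIT
    return sorted(pheromone, key=pheromone.get, reverse=True)

log = [["post-a", "post-b"], ["post-b"], ["post-b", "post-c"]]
print(rank_content(log))  # post-b accumulates the strongest trace
```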

Relevance: 10.00%

Publisher:

Abstract:

Navigation through tessellated solids in GEANT4 can degrade computational performance, especially if the tessellated solid is large and comprises many facets. Redefining a tessellated solid as a mesh of tetrahedra is common in other computational techniques such as finite element analysis, as computations need only consider local tetrahedra rather than the tessellated solid as a whole. Herein we describe a technique that allows for automatic tetrahedral meshing of tessellated solids in GEANT4 and the subsequent loading of these meshes as assembly volumes; loading nested tessellated solids and tetrahedral meshes is also examined. As the technique makes the geometry suitable for automatic optimisation using smart voxels, navigation through a simple tetrahedral mesh has been found to be more than two orders of magnitude faster than navigation through the equivalent tessellated solid. Speed increases of more than two orders of magnitude were also observed for a more complex tessellated solid with voids and concavities. The technique was benchmarked for geometry load time, simulation run time and memory usage. Source code enabling the described functionality in GEANT4 has been made freely available on the Internet.
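
The core meshing idea can be sketched for the simplest case: a closed, convex triangulated surface is decomposed by fanning each facet to an interior point, yielding one tetrahedron per facet. This is an illustrative simplification, not the paper’s implementation (which handles general solids); in GEANT4 each resulting tetrahedron would become a G4Tet placed through an assembly volume, a step not shown here.

```python
# Sketch of the meshing idea for the simplest case: decompose a closed,
# convex triangulated surface into tetrahedra by fanning every facet to an
# interior point. In GEANT4 each tetrahedron would become a G4Tet logical
# volume placed through an assembly volume; that step is not shown here.
import numpy as np

def fan_tetrahedralise(vertices, facets):
    """vertices: (N, 3) array; facets: (i, j, k) index triples.
    Returns one 4x3 vertex array (a tetrahedron) per surface facet."""
    vertices = np.asarray(vertices, dtype=float)
    apex = vertices.mean(axis=0)  # interior point for a convex solid
    return [np.vstack([apex, vertices[i], vertices[j], vertices[k]])
            for i, j, k in facets]

# The 4-facet surface of a unit tetrahedron decomposes into 4 tetrahedra.
verts = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]]
faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(len(fan_tetrahedralise(verts, faces)), "tetrahedra")
```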

Relevance: 10.00%

Publisher:

Abstract:

Queensland University of Technology (QUT) was one of the first universities in Australia to establish an institutional repository. Launched in November 2003, the repository (QUT ePrints) uses the EPrints open source repository software (from Southampton) and has enjoyed the benefit of an institutional deposit mandate since January 2004. Currently (April 2012), the repository holds over 36,000 records, including 17,909 open access publications, with another 2,434 publications embargoed but with mediated access enabled via the ‘Request a copy’ button, a feature of the EPrints software. At QUT, the repository is managed by the Library. QUT ePrints (http://eprints.qut.edu.au) is embedded into a number of other systems at QUT, including the staff profile system and the University’s research information system. It has also been integrated into a number of critical processes related to Government reporting and research assessment. Internally, senior research administrators often look to the repository for information to assist with decision-making and planning. While some statistics could be drawn from the advanced search feature and the existing download statistics feature, they were rarely at the level of granularity or aggregation required, and getting the information from the ‘back end’ of the repository was very time-consuming for Library staff. In 2011, the Library funded a project to enhance the range of statistics available from the public interface of QUT ePrints. The repository team conducted a series of focus groups and individual interviews to identify and prioritise functionality requirements for a new statistics ‘dashboard’. The participants included a mix of research administrators, early career researchers and senior researchers. The repository team identified a number of business criteria (e.g. extensibility, support available, skills required) and gave each a weighting. After considering all the known options, five software packages (IRStats, ePrintsStats, AWStats, BIRT and Google Urchin/Analytics) were thoroughly evaluated against a list of 69 criteria to determine which would be most suitable. The evaluation revealed that IRStats was the best fit for our requirements, deemed capable of meeting 21 of the 31 high-priority criteria. Consequently, IRStats was implemented as the basis for QUT ePrints’ new statistics dashboards, which were launched in Open Access Week, October 2011. Statistics dashboards are now available at four levels: whole-of-repository, organisational unit, individual author and individual item. The data available include cumulative total deposits, time-series deposits, deposits by item type, percentage of full texts, percentage open access, cumulative downloads, time-series downloads, downloads by item type, author ranking, paper ranking (by downloads), downloader geographic location, domains, internal vs external downloads, citation data (from Scopus and Web of Science), most popular search terms, and non-search referring websites. The data are displayed in chart, map and table formats. The new statistics dashboards have been a great success. Feedback received from staff and students has been very positive. Individual researchers have said that they have found the information very useful when compiling a track record. It is now very easy for senior administrators (including the Deputy Vice-Chancellor, Research) to compare full-text deposit rates (i.e. mandate compliance rates) across organisational units. This has led to increased ‘encouragement’ from Heads of School and Deans in relation to the provision of full-text versions.
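
The package evaluation described above amounts to a weighted-criteria scoring exercise. The sketch below illustrates the shape of that calculation; the criteria, weights and ratings are invented for illustration and are not the Library’s actual 69-criterion evaluation data.

```python
# Hypothetical weighted-criteria scoring of candidate statistics packages.
# Criteria, weights and ratings are invented for illustration; they are not
# the Library's actual evaluation data.
criteria_weights = {"extensible": 3, "support available": 2, "skills required": 1}

ratings = {  # each package rated 0-5 against each criterion (invented)
    "IRStats": {"extensible": 4, "support available": 4, "skills required": 3},
    "AWStats": {"extensible": 2, "support available": 3, "skills required": 4},
}

def weighted_total(package):
    return sum(w * ratings[package][c] for c, w in criteria_weights.items())

totals = {p: weighted_total(p) for p in ratings}
print(totals, "-> best fit:", max(totals, key=totals.get))
```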

Relevance: 10.00%

Publisher:

Abstract:

There is a need for an accurate, real-time, quantitative system that would enhance decision-making in the treatment of osteoarthritis. To achieve this objective, significant research is required to enable articular cartilage properties to be measured and categorised for health and functionality without laboratory tests involving biopsies for pathological evaluation. Such a system would provide access to the internal condition of the cartilage matrix and thus extend the vision-based arthroscopy currently used beyond the subjective evaluation of surgeons. The system must be able to non-destructively probe the entire thickness of the cartilage and its immediate subchondral bone layer. In this thesis, near infrared spectroscopy is investigated for this purpose. The aim is to relate the structure and load-bearing properties of the cartilage matrix to the near infrared absorption spectrum and to establish functional relationships that provide objective, quantitative and repeatable categorisation of cartilage condition outside the area of visible degradation in a joint. Based on results from traditional mechanical testing, their innovative interpretation and their relationship with spectroscopic data, new parameters were developed. These were then evaluated for their consistency in discriminating between healthy viable and degraded cartilage. The mechanical and physico-chemical properties were related to specific regions of the near infrared absorption spectrum identified as part of the research conducted for this thesis. The relationships between the tissue’s near infrared spectral response and the new parameters were modelled using multivariate statistical techniques based on partial least squares regression (PLSR). With significantly high levels of statistical correlation, the modelled relationships were demonstrated to possess considerable potential for predicting the properties of unknown tissue samples in a quick and non-destructive manner. In order to adapt near infrared spectroscopy for clinical applications, the balance between probe diameter and the number of active transmit-receive optic fibres must be optimised. This was achieved in the course of this research, resulting in an optimal probe configuration that could be adapted for joint tissue evaluation. Furthermore, as a proof of concept, a protocol for obtaining the new parameters from the near infrared absorption spectra of cartilage was developed, implemented in graphical user interface (GUI)-based software, and used to assess cartilage-on-bone samples in vitro. This conceptual implementation has demonstrated, in part through the individual parametric relationships with the near infrared absorption spectrum, the capacity of the proposed system to facilitate real-time, non-destructive evaluation of cartilage matrix integrity. In summary, the potential of near infrared spectroscopy for evaluating the articular cartilage and bone laminate has been demonstrated in this thesis. The approach could have spin-offs for other soft tissues and organs of the body. It builds on the earlier work of the group at QUT, enhancing the near infrared component of ongoing research on developing a tool for cartilage evaluation that goes beyond visual and subjective methods.
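
A minimal sketch of the PLSR modelling step follows, using synthetic spectra and scikit-learn’s PLSRegression as stand-ins for the thesis’s actual measurements and toolchain.

```python
# Sketch of the PLSR modelling step: predict a mechanical property from an
# NIR absorption spectrum. Synthetic data and scikit-learn's PLSRegression
# stand in for the thesis's measurements and toolchain.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))                 # 60 spectra x 200 wavelengths
y = X[:, 40:60].sum(axis=1) + rng.normal(scale=0.1, size=60)  # 'stiffness'

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=8)            # latent variables; tune by CV
pls.fit(X_tr, y_tr)
print("R^2 on held-out spectra:", round(pls.score(X_te, y_te), 3))
```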

Relevance: 10.00%

Publisher:

Abstract:

For over half a century, art directors within the advertising industry have been adapting to changes in media, culture and the corporate sector to enhance professional performance and competitiveness. These professionals seldom offer explicit justification for the role images play in effective communication. It is uncertain how this situation affects advertising performance, because advertising has nevertheless evolved into an industry able to fabricate new opportunities for itself. However, uncertainties in the formalisation of art direction knowledge restrict the possibilities of knowledge transfer in higher education. The theoretical knowledge supporting advertising art direction has been adapted spontaneously from disciplines that rarely focus on aspects specific to the production of advertising content, such as marketing communication, design, visual communication and visual art. Meanwhile, scholarly research has generated vast empirical knowledge about advertising images, but often with limited insight into production expertise. Because art direction is understood as an industry practice and not as an academic discipline, an art direction perspective in scholarly contributions is rare. Scholarly research that is relevant to art direction seldom offers viewpoints that help us understand how its outputs may specifically contribute to art direction practices. This thesis is dedicated to formally understanding the knowledge underlying art direction and using it to explore models for visual analysis and knowledge transfer in higher education. The first three chapters offer, firstly, a review of practical and contextual aspects that help define art direction as a profession and as a component of higher education; secondly, a discussion of visual knowledge; and thirdly, a literature review of theoretical and analytic aspects relevant to art direction knowledge. Drawing on these three chapters, the thesis establishes explicit structures to support the development of an art direction curriculum in higher education programs. It then explores a theoretical combination of the terms ‘aesthetics’ and ‘strategy’ as foundational notions for the study of art direction. This exploration of the term ‘strategic aesthetics’ unveils the potential for furthering knowledge in visual commercial practices in general. The empirical part of the research explores ways in which strategic aesthetics notions can extend to methodologies of visual analysis. Using a combination of content analysis and structures of interpretive analysis drawn from visual anthropology, the research discusses issues of methodological appropriation as it shifts aspects of conventional methodologies towards producer-centred research paradigms. Drawing on a sample from 2,759 still ads in the online databases of the Cannes Lions Festival, the study uses an instrumental case study of love-related advertising to facilitate the analysis of content. This part of the research helps delineate the limitations and functionality of the theoretical and methodological framework explored in the thesis. In light of the findings and discussions produced throughout, the project provides directions for higher education in relation to art direction and highlights potential pathways for further investigation of strategic aesthetics.

Relevance: 10.00%

Publisher:

Abstract:

This paper presents an approach to deriving requirements for an avionics architecture that provides onboard sense-and-avoid and autonomous emergency forced landing capabilities to a UAS. The approach is based on two design paradigms: (1) analysing the functionality common to these two capabilities in order to derive requirements for sensors, computing capability, interfaces, etc.; and (2) considering the risk and safety mitigation associated with these functions in order to derive certification requirements for the system design. We propose to use the Aircraft Certification Matrix (ACM) approach to tailor the system Development Assurance Levels (DAL) and architecture requirements in accordance with acceptable risk criteria. This architecture is developed under the name “Flight Guardian”. Flight Guardian is an avionics architecture that integrates common sensory elements that are essential components of any UAS required to be dependable. The Flight Guardian concept is also applicable to conventionally piloted aircraft, where it will serve to reduce cockpit workload.
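
The DAL-tailoring step can be pictured with the standard severity-to-assurance-level mapping used in civil avionics (the ARP4754A convention). The sketch below is a simplified illustration of that lookup, not the paper’s actual Aircraft Certification Matrix for Flight Guardian.

```python
# Simplified severity-to-DAL lookup of the kind an Aircraft Certification
# Matrix exercise draws on (standard ARP4754A convention); the paper's
# actual ACM tailoring for Flight Guardian is not reproduced here.
SEVERITY_TO_DAL = {
    "catastrophic": "A",
    "hazardous": "B",
    "major": "C",
    "minor": "D",
    "no effect": "E",
}

def required_dal(worst_credible_effect: str) -> str:
    return SEVERITY_TO_DAL[worst_credible_effect.lower()]

# If losing the sense-and-avoid function is judged 'hazardous', the function
# would need to be developed to DAL B.
print(required_dal("Hazardous"))
```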

Relevance: 10.00%

Publisher:

Abstract:

This article will consider the role that Alternative Dispute Resolution (‘Dham Kha Chen Ki Khendum’ or ‘Nangkha Nangdrik’) currently plays in resolving legal conflict in Bhutan. With a Constitution that has committed to the pursuit of Gross National Happiness, non-adversarial dispute resolution processes that promote continuing relationships and goodwill assume greater importance. One difficulty for Bhutan is that alternative dispute resolution procedures such as mediation (Dhum Drik) are being referred to in enactments of the Bhutanese National Council and National Assembly (bicameral parliament), without a shared understanding as to the characteristics and functionality of these procedures. This article will focus particularly on the current practice of mediation in Bhutan and investigate whether particular models of mediation are more suited to the Bhutanese context, given the particularities of Bhutanese culture, the search for gross national happiness, psychological understandings of happiness and the omnipresent influence of Mahayana Buddhism.

Relevance: 10.00%

Publisher:

Abstract:

Genetic recombination is a fundamental evolutionary mechanism promoting biological adaptation. Using engineered recombinants of the small single-stranded DNA plant virus, Maize streak virus (MSV), we experimentally demonstrate that fragments of genetic material only function optimally if they reside within genomes similar to those in which they evolved. The degree of similarity necessary for optimal functionality is correlated with the complexity of intragenomic interaction networks within which genome fragments must function. There is a striking correlation between our experimental results and the types of MSV recombinants that are detectable in nature, indicating that obligatory maintenance of intragenome interaction networks strongly constrains the evolutionary value of recombination for this virus and probably for genomes in general.

Relevance: 10.00%

Publisher:

Abstract:

The global road safety problem
The role of human factors in road crashes
The use of driving simulators in road safety research
The CARRS-Q advanced driving simulator
– Functionality
– Problems encountered and related solutions
Past and current projects using the driving simulator
Limitations of driving simulators

Relevance: 10.00%

Publisher:

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic, readily available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also to timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate, wirelessly distributed “global” clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly synchronised, distributed image acquisition system over a large geographic area; a real-world application of this functionality is a platform to facilitate wirelessly distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the “global” clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE 1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results.
Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement: the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves relative to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the 1 ms target.
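
The principle behind the IEEE 1588-style synchronisation described above can be sketched from the four timestamps of a two-way message exchange. This is the underlying arithmetic only, assuming a symmetric path; it is not BabelFuse’s firmware protocol.

```python
# The two-way exchange behind IEEE 1588-style synchronisation: t1 = master
# send, t2 = slave receive, t3 = slave send, t4 = master receive. Assuming a
# symmetric path, offset and delay fall out of the four timestamps.

def ptp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way path delay
    return offset, delay

# A slave running 250 us ahead of the master across a 100 us link:
t1 = 0.0
t2 = t1 + 100e-6 + 250e-6   # flight time, read on the fast slave clock
t3 = t2 + 50e-6             # slave turnaround
t4 = t3 + 100e-6 - 250e-6   # flight time, read back on the master clock
print(ptp_offset_and_delay(t1, t2, t3, t4))  # ~ (250e-6, 100e-6)
```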

Relevance: 10.00%

Publisher:

Abstract:

Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics; when used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we explored the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of inferred transcriptional regulatory networks. In a preliminary exploration of the relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Of chief interest was the relationship observed between promoter strength and transcription factors grouped by their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would be preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters would have more repressor binding sites to repress or inhibit gene transcription. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggests support for our (alternative) hypothesis, albeit this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Although the observations were specific to σ70, such suggestive results strongly encourage additional investigation when more experimentally confirmed data become available.
Much of the remainder of the thesis concerns a machine learning study of binding site prediction using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as to the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in ‘moderately’ conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance, in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potentially active functionality in non-pathogens and its known participation in the full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled ‘regulatory trees’, inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, the regulatory trees are constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to ‘hardware’, the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to ‘software’. In this context, we explored the ‘pan-regulatory network’ for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the predicted regulatory interactions. In the present study, we distinguish between relationships found across the full set of genomes, the ‘core regulatory set’, and interactions found only in a subset of the genomes explored, the ‘sub-regulatory set’. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival.
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
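
To illustrate the spectrum-kernel SVM approach used above for binding site refinement, the sketch below computes a k-mer spectrum kernel over toy DNA strings and hands the Gram matrix to an SVM in precomputed-kernel mode. The sequences, labels and k value are invented; the thesis’s actual pipeline and feature attributes are far richer.

```python
# Toy spectrum-kernel SVM: a k-mer count feature map over DNA strings, with
# the Gram matrix handed to scikit-learn's SVC in precomputed-kernel mode.
import numpy as np
from collections import Counter
from sklearn.svm import SVC

def spectrum(seq, k=3):
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def kernel_matrix(rows, cols, k=3):
    # K(x, y) = sum over shared k-mers of count_x(kmer) * count_y(kmer)
    rs = [spectrum(s, k) for s in rows]
    cs = [spectrum(s, k) for s in cols]
    return np.array([[float(sum(r[m] * c[m] for m in r)) for c in cs]
                     for r in rs])

train = ["ACGTACGTACGT", "TGCATGCATGCA", "ACGTTACGTTAC", "TGCAATGCAATG"]
labels = [1, 0, 1, 0]   # 1 = putative binding site, 0 = background (toy)
clf = SVC(kernel="precomputed").fit(kernel_matrix(train, train), labels)
print(clf.predict(kernel_matrix(["ACGTACGTTACG"], train)))  # expect [1]
```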

Relevance: 10.00%

Publisher:

Abstract:

The concept that the cellular glycoprotein vitronectin acts as a biological ‘glue’ and key controller of mammalian tissue repair and remodelling activity has emerged from nearly 50 years of experimental in vitro and in vivo data. Unexpectedly, the vitronectin knockout mouse was found to be viable and to have a largely normal phenotype. However, diligent observation revealed that the VN-KO animal exhibits delayed coagulation and poor wound healing. This is interpreted to indicate that vitronectin occupies a role in the earliest events of thrombogenesis and tissue repair: it is a foundation upon which the thrombus grows in an organised structure. In addition to closing the wound, the thrombus also serves to protect the underlying tissue from oxidation, is a reservoir of mitogens and tissue repair mediators, and provides a provisional scaffold for the repairing tissue. In the absence of vitronectin (e.g. in the VN-KO animal), this cascade is disrupted before it begins. Our data demonstrate that a wide variety of biologically active species associate with VN. While initial studies focused on mitogens, other classes of bioactives (e.g. glycosaminoglycans, metalloproteinases) are now also known to interact specifically with VN. Many of these interactions are long-lived, often resulting in multi-protein complexes that are stable for prolonged periods. Multi-protein complexes provide several advantages: prolonging molecular interaction, sustaining local concentrations, and facilitating co-stimulation of cell surface receptors, thereby enhancing cellular and biological responses. We contend that these, or equivalent, multi-protein complexes mediate vitronectin functionality in vivo. It is also likely that many of the species demonstrated to associate with vitronectin in vitro also associate with vitronectin in vivo in similar multi-protein complexes. Thus the predominant biological function of vitronectin is that of a master controller of the extracellular environment, informing, and possibly instructing, cells ‘where’ to behave, ‘when’ to behave, and ‘how’ to behave (i.e. appropriately for the current circumstance).

Relevance: 10.00%

Publisher:

Abstract:

While highly cohesive groups are potentially advantageous, they are also often correlated with the emergence of knowledge and information silos based around those same functional or occupational clusters. Consequently, an essential challenge for engineering organisations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. This paper acts as a primer for those seeking to understand the design, functionality and utility of a suite of software tools generically termed social media technologies, in the context of optimising the management of tacit engineering knowledge. Underpinned by knowledge management theory and using detailed case examples, the paper explores how social media technologies achieve such goals, allowing for the transfer of knowledge by tapping into the tacit and explicit knowledge of disparate groups in complex engineering environments.