Abstract:
Continuum diffusion models are often used to represent the collective motion of cell populations. Most previous studies have simply used linear diffusion to represent collective cell spreading, while others found that degenerate nonlinear diffusion provides a better match to experimental cell density profiles. In the cell modeling literature there is no guidance available with regard to which approach is more appropriate for representing the spreading of cell populations. Furthermore, there is no knowledge of particular experimental measurements that can be made to distinguish between situations where these two models are appropriate. Here we provide a link between individual-based and continuum models using a multi-scale approach in which we analyze the collective motion of a population of interacting agents in a generalized lattice-based exclusion process. For round agents that occupy a single lattice site, we find that the relevant continuum description of the system is a linear diffusion equation, whereas for elongated rod-shaped agents that occupy L adjacent lattice sites, we find that the relevant continuum description is connected to the porous medium equation (PME). The exponent in the nonlinear diffusivity function is related to the aspect ratio of the agents. Our work provides a physical connection between modeling collective cell spreading and the use of either the linear diffusion equation or the PME to represent cell density profiles. The results suggest that when using continuum models to represent cell population spreading, care should be taken to account for variations in cell aspect ratio, because different aspect ratios lead to different continuum models.
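A minimal sketch of the two continuum limits this abstract contrasts (the specific power-law form and the symbols D_0, C_0 and \alpha below are illustrative assumptions, not the paper's derived expressions):

```latex
% Nonlinear diffusion with a power-law diffusivity:
%   \alpha = 0 recovers linear diffusion (round, single-site agents);
%   \alpha > 0 gives the degenerate porous medium equation (rod-shaped
%   agents), with the exponent tied to the aspect ratio L.
\frac{\partial C}{\partial t}
  = \nabla \cdot \left[ D_0 \left( \frac{C}{C_0} \right)^{\alpha} \nabla C \right]
```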
Abstract:
A novel and comprehensive testing approach to examine the performance of gross pollutant traps (GPTs) was developed. A proprietary GPT with internal screens for capturing gross pollutants—organic matter and anthropogenic litter—was used as a case study. This work is the first investigation of its kind and provides valuable practical information for the design, selection and operation of GPTs, as well as for the management of street waste in an urban environment. It used a combination of physical and theoretical models to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The results showed that the GPT operated efficiently until at least 68% of the screens were blocked, particularly at high flow rates. At lower flow rates, this high capture/retention performance trend was reversed. It was also found that a raised-inlet GPT offered better capture/retention performance. This finding indicates that cleaning operations could be planned more effectively by tracking the deterioration in the GPT's capture/retention performance.
Abstract:
This paper presents an approach to predicting the operating conditions of a machine, based on classification and regression trees (CART) and an adaptive neuro-fuzzy inference system (ANFIS) in association with a direct prediction strategy for multi-step-ahead time series prediction. In this study, the number of available observations and the number of predicted steps are initially determined using the false nearest neighbor method and the auto mutual information technique, respectively. These values are subsequently utilized as inputs for the prediction models to forecast the future values of the machine's operating conditions. The performance of the proposed approach is then evaluated using real trending data from a low-methane compressor. A comparative study of the predicted results obtained from the CART and ANFIS models is also carried out to appraise the prediction capability of these models. The results show that the ANFIS prediction model can track changes in machine conditions and has potential for use as a tool for machine fault prognosis.
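As a hedged sketch of the direct multi-step prediction strategy described here, using scikit-learn's DecisionTreeRegressor as a stand-in for CART (the lag and horizon values below are illustrative assumptions; the paper selects them via the false nearest neighbor and auto mutual information methods):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def direct_multistep_forecast(series, n_lags=5, horizon=3):
    """Direct strategy: fit one independent model per step h = 1..horizon."""
    n = len(series) - n_lags - horizon + 1
    # Input windows of the last n_lags observations.
    X = np.array([series[i:i + n_lags] for i in range(n)])
    last_window = np.asarray(series[-n_lags:]).reshape(1, -1)
    forecasts = []
    for h in range(1, horizon + 1):
        # Target for step h: the value h steps beyond each input window.
        y = np.array([series[i + n_lags + h - 1] for i in range(n)])
        model = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X, y)
        forecasts.append(model.predict(last_window)[0])
    return forecasts

# Illustrative use on a synthetic trending signal standing in for compressor data.
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=200))
print(direct_multistep_forecast(series, n_lags=5, horizon=3))
```

The direct strategy trains a separate model for each forecast step, which avoids the error accumulation that arises when predictions are recursively fed back as inputs.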
Abstract:
A range of terms is used in Australian higher education institutions to describe learning approaches and teaching models that provide students with opportunities to engage in learning connected to the world of work. The umbrella term currently in wide use is Work Integrated Learning (WIL). The common aim of approaches captured under the term WIL is to integrate discipline-specific knowledge learnt in a university setting with that learnt in the practice of work, through purposefully designed curriculum. In endeavours to extend WIL opportunities for students, universities are currently exploring authentic learning experiences, both within and outside of university settings. Some universities describe these approaches as ‘real world learning’ or ‘professional learning’. Others refer to ‘social engagement’ with the community and focus on building social capital and citizenship through curriculum design that enables students to engage with the professions through a range of learning experiences. This chapter discusses the context for WIL across Australian universities, together with its scope, purposes, characteristics and effectiveness, as derived from a national scoping study. The study, undertaken in response to a high level of interest in WIL, involved data collection from academic and professional staff and students at nearly all Australian universities. Participants in the study consistently reported the benefits, especially in relation to the student learning experience. Responses highlight the importance of strong partnerships between stakeholders in facilitating effective learning outcomes, and identify a range of issues that shape the quality of the approaches and models being adopted in promoting professional learning.
Abstract:
Concerns raised in educational reports about school science, in terms of students' outcomes and attitudes as well as science teaching practices, prompted investigation into science learning and teaching practices at the foundational level of school science. Without science content and process knowledge, understanding issues of modern society and active participation in decision-making is difficult. This study contended that a focus on the development of the language of science could enable learners to engage more effectively in learning science and enhance their interest in and attitudes towards science. Furthermore, it argued that explicit teaching practices, in which science language is modelled and scaffolded, would facilitate the learning of science by young children at the beginning of their formal schooling. This study aimed to investigate science language development at the foundational level of school science learning in the preparatory school, with students aged five and six years. It focussed on the language of science and science teaching practices in early childhood. In particular, the study focussed on the capacity of young students to engage with and understand science language. Previous research suggests that students have difficulty with the language of science, most likely because of its complexities and ambiguities. Furthermore, the literature indicates that tensions arise between traditional science teaching practices and accepted early childhood teaching practices. This contention prompted investigation into means and models of pedagogy for learning foundational science language, knowledge and processes in early childhood. The study was positioned within qualitative assumptions of research and reported via descriptive case study. It was located in a preparatory-school classroom where the class teacher, teacher-aide, and nineteen students aged four and five years participated with the researcher in the study. Basil Bernstein's pedagogical theory, coupled with Halliday's Systemic Functional Linguistics (SFL), framed an examination of science pedagogical practices for early childhood science learning. Students' science learning outcomes were gauged by focussing a Hallidayan lens on their oral and reflective language during 12 science-focussed episodes of teaching. Data were collected throughout the 12 episodes and included video- and audio-taped science activities, student artefacts, journal and anecdotal records, semi-structured interviews and photographs. Data were analysed according to Bernstein's visible and invisible pedagogies and performance and competence models. Additionally, Halliday's SFL provided the resource to examine teacher and student language, determining teacher/student interpersonal relationships as well as the specialised science and everyday language used in teacher and student science talk. This analysis established the socio-linguistic characteristics that promoted science competencies in young children. Analysis of the data identified those teaching practices that facilitate young children's acquisition of science meanings. Positive indications for modelling science language and science text types to young children emerged. Teaching within the studied setting diverged from perceived notions of common early childhood practices, and the benefits of dynamically shifting pedagogies were validated.
Significantly, the young students demonstrated use of particular specialised components of school-science language in terms of science language features and vocabulary. Their use of language also demonstrated knowledge of science concepts, processes and text types. The young students made sense of science phenomena by incorporating a variety of science language and text types in explanations during both teacher-directed and independent situations. The study informs early childhood science practices as well as practices for foundational school science teaching and learning. It has exposed implications for science education policy, curriculum and practices, and supports other findings in relation to the capabilities of young students. The study contributes to Systemic Functional Linguistic theory through the development of a specific resource to determine the technicality of teacher language used in teaching young students. Furthermore, it contributes to methodology practices relating to Bernsteinian theoretical perspectives and has demonstrated new ways of depicting and reporting teaching practices. It provides an analytical tool that couples Bernsteinian and Hallidayan theoretical perspectives. Ultimately, it defines directions for further research in terms of foundational science language learning, ongoing learning of the language of science and learning science, science teaching and learning practices (specifically in foundational school science), and relationships between home and school science language experiences.
Abstract:
Although there are widely accepted and utilized models and frameworks for nondirective counseling (NDC), there is little in the way of tools or instruments designed to assist in determining whether or not a specific episode of counseling is consistent with the stated model or framework. The Counseling Progress and Depth Rating Instrument (CPDRI) was developed to evaluate counselor integrity in the use of Egan's skilled helper model in online counseling. The instrument was found to have sound internal consistency, good interrater reliability, and good face and convergent validity. The CPDRI is therefore proposed as a useful tool to facilitate investigation of the degree to which counselors adhere to and apply a widely used approach to NDC.
Abstract:
In this chapter I position the iPhone as a “moment” in the history of cultural technologies. Drawing predominantly on advertising materials and public conversations about other “moments” in the history of personal computing, and focusing on Apple’s role in this history, I argue that the design philosophy, marketing, and business models behind the iPhone (and now the iPad) have decisively reframed the values of usability that underpin software and interface design in the consumer technology industry, marking a distinctive shift in the history and contested futures of digital culture.
Abstract:
Photochemistry has made significant contributions to our understanding of many important natural processes as well as to scientific discoveries in the man-made world. The measurements from such studies are often complex and may require advanced data interpretation with the use of multivariate or chemometrics methods. In general, such methods have been applied successfully for data display, classification, multivariate curve resolution and prediction in analytical chemistry, environmental chemistry, engineering, medical research and industry. In photochemistry, by comparison, applications of such multivariate approaches have been less frequent, although a variety of methods have been used, especially in spectroscopic photochemical applications. The methods include Principal Component Analysis (PCA; data display), Partial Least Squares (PLS; prediction), Artificial Neural Networks (ANN; prediction) and several models for multivariate curve resolution related to Parallel Factor Analysis (PARAFAC; decomposition of complex responses). Applications of such methods are discussed in this overview, and typical examples include photodegradation of herbicides, prediction of antibiotics in human fluids (fluorescence spectroscopy), non-destructive in-line and on-line monitoring (near infrared spectroscopy) and fast time-resolution of spectroscopic signals from photochemical reactions. It is also quite clear from the literature that the scope of spectroscopic photochemistry has been enhanced by the application of chemometrics. To highlight and encourage further applications of chemometrics in photochemistry, several additional chemometrics approaches are discussed using data collected by the authors. The use of a PCA biplot is illustrated with an analysis of a matrix containing data on the performance of photocatalysts developed for water splitting and hydrogen production. In addition, the application of Multi-Criteria Decision Making (MCDM) ranking methods and Fuzzy Clustering is demonstrated with an analysis of a water quality data matrix. Other examples include the application of simultaneous kinetic spectroscopic methods for the prediction of pesticides, and the use of a response fingerprinting approach for the classification of medicinal preparations. Overall, the overview endeavours to emphasise the advantages of chemometric interpretation of multivariate photochemical data, and an Appendix of references and summaries of the common and less usual chemometrics methods noted in this work is provided.
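As a hedged illustration of the PCA biplot technique mentioned above (the data matrix and variable names below are synthetic stand-ins, not the authors' photocatalyst data):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a photocatalyst performance matrix:
# rows = catalysts, columns = measured performance variables.
rng = np.random.default_rng(1)
X = rng.normal(size=(10, 4))
variables = ["H2 yield", "band gap", "surface area", "stability"]

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
loadings = pca.components_.T  # one (PC1, PC2) weight pair per variable

# Biplot: samples as points, variables as arrows in the same PC plane.
fig, ax = plt.subplots()
ax.scatter(scores[:, 0], scores[:, 1], label="catalysts")
for (dx, dy), name in zip(loadings, variables):
    ax.arrow(0, 0, 2 * dx, 2 * dy, head_width=0.05, color="red")
    ax.annotate(name, (2.2 * dx, 2.2 * dy))
ax.set_xlabel("PC1"); ax.set_ylabel("PC2")
ax.legend()
plt.show()
```

Reading the biplot, samples that cluster together behave similarly, and a sample lying in the direction of a variable's arrow scores highly on that variable.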
Abstract:
BACKGROUND: Chlamydia trachomatis is a major cause of sexually transmitted disease in humans. Previous studies in both humans and animal models of chlamydial genital tract infection have suggested that the hormonal status of the genital tract epithelium at the time of exposure can influence the outcome of the chlamydial infection. We performed a whole-genome transcriptional profiling study of C. trachomatis infection in ECC-1 cells under progesterone or estradiol treatment. RESULTS: Both hormone treatments caused a significant shift in the subset of genes expressed (25% of the transcriptome altered by more than 2-fold). Overall, estradiol treatment resulted in the down-regulation of 151 genes, including those associated with lipid and nucleotide metabolism. Of particular interest was the up-regulation in estradiol-supplemented cultures of six genes (omcB, trpB, cydA, cydB, pyk and yggV), which suggests a stress response similar to that reported previously in other models of chlamydial persistence. We also observed morphological changes consistent with a persistence response. By comparison, progesterone supplementation resulted in a general up-regulation of an energy-utilising response. CONCLUSION: Our data show, for the first time, that the treatment of chlamydial host cells with key reproductive hormones such as progesterone and estradiol results in significantly altered chlamydial gene expression profiles. It is likely that these chlamydial expression patterns are survival responses, evolved by the pathogen to enable it to overcome the host's innate immune response. The induction of chlamydial persistence is probably a key component of this survival response.
Abstract:
In late 2007, Gold Coast City Council libraries embarked on an online library project designed to ramp up the libraries’ online services to customers. As part of this project, the Young People’s team identified a need to connect with youth aged 12 to 16 in the online environment, in order to create a direct channel of communication with this market segment and encourage them to engage with the library. Blogging was identified as an appropriate means of communicating with both current and potential library customers in this age group. The Young People’s team consequently prepared a concept plan for a youth blog to launch in Children’s Book Week 2008, and is working towards the development of management and administrative models, and the documentation and implementation of the blog itself. While many libraries have been quick to take up Web 2.0-style services, there has been little formal publication about the successes (or failures) of this type of project. Likewise, few libraries have published on the planning, management, and administration of such services. The youth blog currently in development at Gold Coast City Council libraries will be supported by a robust planning phase and will be rigorously evaluated as part of the project. This paper reports on the project (its aims, objectives and outputs), the planning process, and the evaluation activities and outcomes.
Abstract:
Concrete is commonly used as a primary construction material for tall buildings. Load-bearing components such as columns and walls in concrete buildings are subjected to instantaneous and long-term axial shortening caused by the time-dependent effects of "shrinkage", "creep" and "elastic" deformations. Reinforcing steel content, variable concrete modulus, the volume-to-surface-area ratio of the elements, and environmental conditions govern axial shortening. The impact of differential axial shortening among columns and core shear walls escalates with increasing building height. Differential axial shortening of gravity-loaded elements in geometrically complex and irregular buildings results in permanent distortion and deflection of the structural frame, which has a significant impact on building envelopes, building services, secondary systems and the lifetime serviceability and performance of a building. Existing numerical methods commonly used in design to quantify axial shortening are mainly based on elastic analytical techniques and are therefore unable to capture the complexity of non-linear time-dependent effects. Ambient measurement of axial shortening using vibrating wire, external mechanical and electronic strain gauges can verify values pre-estimated at the design stage, but installing these gauges permanently embedded in or on the surface of concrete components for continuous measurement during and after construction, with adequate protection, is uneconomical, inconvenient and unreliable. Such methods are therefore rarely, if ever, used in actual building construction practice. This research project has developed a rigorous numerical procedure that encompasses linear and non-linear time-dependent phenomena for the prediction of axial shortening of reinforced concrete structural components at the design stage. This procedure takes into consideration (i) the construction sequence, (ii) time-varying values of the Young's modulus of reinforced concrete and (iii) creep and shrinkage models that account for variability resulting from environmental effects. The capabilities of the procedure are illustrated through examples. In order to update earlier predictions of axial shortening during the construction and service stages of the building, this research has also developed a vibration-based procedure using ambient measurements. This procedure takes into consideration the changes in the vibration characteristics of the structure during and after construction. Its application is illustrated through numerical examples, which also highlight its features. The vibration-based procedure can also be used as a tool to assess the structural health and performance of key structural components in the building during construction and service life.
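A minimal sketch of the strain decomposition that underlies such axial-shortening predictions (the notation below is an illustrative assumption, not the thesis's own formulation):

```latex
% Total time-dependent strain in a loaded concrete element,
% commonly decomposed into elastic, creep and shrinkage components:
\varepsilon(t) = \varepsilon_{e}(t) + \varepsilon_{cr}(t) + \varepsilon_{sh}(t)
```

Here \varepsilon_{e} is the instantaneous elastic strain (governed by the time-varying Young's modulus), \varepsilon_{cr} the creep strain and \varepsilon_{sh} the shrinkage strain; the axial shortening of a column then follows by accumulating strain over its height for each construction stage.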
Abstract:
Over the past decade our understanding of foot function has increased significantly [1,2]. Our understanding of foot and ankle biomechanics appears to be directly correlated with advances in the models used to assess and quantify kinematic parameters in gait. These advances in models in turn lead to greater detail in the data. However, we must consider that the appropriate level of complexity is determined by the question or task being analysed. This systematic review aims to provide a critical appraisal of commonly used marker sets and foot models for assessing foot and ankle kinematics across a wide variety of clinical and research purposes.
Abstract:
Digital human modelling (DHM) has today matured from research into industrial application. In the automotive domain, DHM has become a commonly used tool in virtual prototyping and human-centred product design. While the current generation of DHM supports the ergonomic evaluation of new vehicle designs during early design stages, by modelling anthropometry, posture and motion or predicting discomfort, the future of DHM will be dominated by CAE methods, realistic 3D design, and musculoskeletal and soft-tissue modelling down to the micro-scale of molecular activity within single muscle fibres. As a driving force for DHM development, the automotive industry has traditionally used human models in the manufacturing sector (production ergonomics, e.g. assembly) and the engineering sector (product ergonomics, e.g. safety, packaging). In product ergonomics applications, DHMs share many common characteristics, creating a unique subset of DHM. These models are optimised for a seated posture, interface to a vehicle seat through standardised methods and provide linkages to vehicle controls. As tools, they need to interface with other analytic instruments and integrate into complex CAD/CAE environments. Important aspects of current DHM research are functional analysis, model integration and task simulation. Digital (virtual, analytic) prototypes or digital mock-ups (DMUs) provide expanded support for testing and verification and consider task-dependent performance and motion. Beyond rigid-body mechanics, soft-tissue modelling is evolving to become standard in future DHM. When addressing advanced issues beyond the physical domain (for example, anthropometry and biomechanics), modelling of human behaviours and skills is also integrated into DHM. The latest developments include a more comprehensive approach that implements perceptual, cognitive and performance models, representing human behaviour on a non-physiological level. Through the integration of algorithms from the artificial intelligence domain, a vision of the virtual human is emerging.
Abstract:
The Time magazine ‘Person of the Year’ award is a venerable institution. Established by Time’s founder Henry Luce in 1927 as ‘Man of the Year’, it is an annual award given to ‘a person, couple, group, idea, place, or machine that “for better or for worse ... has done the most to influence the events of the year”’ (Time 2002, p. 1). In 2010, the award was given to Mark Zuckerberg, the founder and CEO of the social networking site Facebook. There was, however, a strong campaign for the ‘People’s Choice’ award to be given to Julian Assange, the founder and editor-in-chief of Wikileaks, the online whistleblowing site. Earlier in the year Wikileaks had released more than 250 000 US government diplomatic cables through the internet, and the subsequent controversies around the actions of Wikileaks and Assange came to be known worldwide as ‘Cablegate’. The focus of this chapter is not on the implications of ‘Cablegate’ for international diplomacy, which continue to have great significance, but rather upon what the emergence of Wikileaks has meant for journalism, and whether it provides insights into the future of journalism. Both Facebook and Wikileaks, as well as social media platforms such as Twitter and YouTube, and independent media practices such as blogging, citizen journalism and crowdsourcing, are manifestations of the rise of social media, or what has also been termed web 2.0. The term ‘web 2.0’ was coined by Tim O’Reilly, and captures the rise of online social media platforms and services that better realise the collaborative potential of digitally networked media. They do this by moving from the relatively static, top-down notions of interactivity that informed early internet development towards more open and evolutionary models that better harness collective intelligence by enabling users to become creators and collaborators in the development of online media content (Musser and O’Reilly 2007; Bruns 2008).
Abstract:
The establishment by the Prime Minister of the Community Business Partnerships Board, along with recent taxation reform, has drawn attention to corporate philanthropy in Australia. Definitions and models are needed as each of the potential partners – government, corporations and nonprofit organisations – attempts to come to grips with the opportunities. The intending partners will need to determine their responsibilities and desired outcomes so that they may work effectively towards mutually beneficial working relationships. Performance indicators need to be determined, benchmarks developed and best practice promoted. A dearth of research exists in this area (Burch, 1998; Industry Commission Report, 1995; Lyons & Hocking, 1998). More exhaustive research, and the collection and analysis of appropriate data, will aid the process. This particular research indicates a lack of understanding between corporations and nonprofit organisations. There are risks inherent in the proposed partnerships, such as the inability to reach agreement, the potential for increased costs, and failure to deliver by one of the partners. This paper assesses opportunities and risks, suggests topics for high-level debate, and indicates models for the development of partnerships.