856 results for Socio-ecological models


Relevance: 20.00%

Abstract:

The QUT Extreme Science and Engineering program provides free hands-on workshops to schools, presented by scientists and engineers to students from prep to year 12 in their own classrooms. The workshops are tied to the school curriculum and give students access to professional-quality instruments, helping to stimulate their interest in science and engineering, with the aim of generating a greater take-up of STEM-related subjects in the senior high school years. In addition to engaging students in activities, workshop presenters provide role models of both genders, helping to break down preconceived ideas of the type of person who becomes a scientist or engineer and demystifying the university experience. The Extreme Science and Engineering vans have been running for 10 years and as such demonstrate a sustainable and reproducible model for school engagement. With funding provided through QUT’s Widening Participation Equity initiative (HEPPP funded), the vans, which had averaged 120 school visits each year, increased to 150+ visits in 2010. Additionally, 100+ workshops (hands-on and career focused) were presented to students from low socio-economic status schools on the three QUT campuses in 2011. While this is designed as a long-term initiative, the short-term results have been very promising, with 3000 students attending the workshops in the first six months, and teacher and student feedback has been overwhelmingly positive.

Relevance: 20.00%

Abstract:

The three studies in this thesis focus on happiness and age and seek to contribute to our understanding of how happiness changes over the lifetime. The first study offers an explanation for what was evolving into a ‘stylised fact’ in the economics literature: the U-shape of happiness in age. No U-shape is evident from a visual inspection of the age-happiness relationship in the German socio-economic panel data, and it seems counter-intuitive that we just have to wait until we get old to be happy. Eliminating the very young, the very old, and the first-timers from the analysis did not explain away regression results supporting the U-shape of happiness in age, but fixed-effect analysis did. Analysis revealed that reverse causality arising from time-invariant individual traits explained the U-shape of happiness in age in the German population, and the results were robust across six econometric methods. Robustness was added to the German fixed-effect finding by replicating it with the Australian and British socio-economic panel data sets. During analysis of the German data an unexpected finding emerged: an exceedingly large negative linear effect of age on happiness in fixed-effect regressions. There is a large self-reported happiness decline by those who remain in the German panel. A similar decline over time was not evident in the Australian or British data. After testing away age, time and cohort effects, a time-in-panel effect was found: Germans who remain in the panel for longer progressively report lower levels of happiness. Because time-in-panel effects have not been included in happiness regression specifications, our estimates may be biased; perhaps some economics-of-happiness studies that used German panel data need revisiting.

The second study builds upon the fixed-effect finding of the first study and extends our view of lifetime happiness to a cohort little visited by economists: children. Initial analysis extends our view of lifetime happiness beyond adulthood and reveals a happiness decline in adolescent (15 to 23 year-old) Australians that is twice the size of the happiness decline we see in older Australians (75 to 86 year-olds), whom we expect to be unhappy due to declining income, failing health and the approach of death. To resolve a difference of opinion in the literature as to whether childhood happiness decreases, increases, or remains flat with age, survey instruments and an Internet-based survey were developed and used to collect data from four hundred 9 to 14 year-old Australian children. Applying the data to a Model of Childhood Happiness revealed that the natural-environment life-satisfaction domain factor did not have a significant effect on childhood happiness. However, the children’s school-environment and interactions-with-friends life-satisfaction domain factors explained over half of a steep decline in childhood happiness that is three times larger than what we see in older Australians. Adding personality to the model revealed what we expect to see with adults, that extraverted children are happier, but unexpectedly, so are conscientious children.

With the steep decline in the happiness of young Australians revealed and explanations offered, the third study builds on the time-invariant individual trait finding from the first study by applying the Australian panel data to an Aggregate Model of Average Happiness over the lifetime. The model’s independent variable is the stress that arises from the interaction between personality and the life-event shocks that affect individuals and their peers throughout their lives. Interestingly, a graphic depiction of the stress-in-age relationship reveals an inverse U-shape: the opposite of the U-shape of happiness in age we saw in the first study. The stress arising from life-event shocks is found to explain much of the change in average happiness over a lifetime. With the policy recommendations of economists potentially invoking unexpected changes in our lives, the ensuing stress and resulting (un)happiness warrant consideration before economists make policy recommendations.
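
As a minimal sketch of the first study's key contrast (simulated placeholder data and hypothetical column names, not the German panel itself): a pooled OLS regression that is quadratic in age, compared against the same regression after within-person demeaning, which absorbs time-invariant traits.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated panel: happiness has NO true age effect, but a time-invariant
# trait is U-shaped across the ages at which people are observed.
n_people, n_waves = 1000, 10
start_age = rng.integers(20, 71, n_people)
trait = 0.002 * (start_age - 45.0) ** 2 + rng.normal(0, 0.3, n_people)

person = np.repeat(np.arange(n_people), n_waves)
age = np.repeat(start_age, n_waves) + np.tile(np.arange(n_waves), n_people)
happiness = 5 + np.repeat(trait, n_waves) + rng.normal(0, 0.5, n_people * n_waves)

df = pd.DataFrame({"person": person, "age": age, "happiness": happiness})
df["age_sq"] = df["age"] ** 2

# Pooled OLS recovers a spurious U-shape (negative age, positive age_sq)...
pooled = sm.OLS(df["happiness"], sm.add_constant(df[["age", "age_sq"]])).fit()

# ...which vanishes once within-person demeaning absorbs the trait,
# mirroring the fixed-effect finding described above.
within = df.groupby("person")[["happiness", "age", "age_sq"]].transform(
    lambda x: x - x.mean())
fe = sm.OLS(within["happiness"], within[["age", "age_sq"]]).fit()

print("pooled:       ", pooled.params.round(4).to_dict())
print("fixed effects:", fe.params.round(4).to_dict())
```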

Relevance: 20.00%

Abstract:

The question of under what conditions conceptual representation is compositional remains debatable within cognitive science. This paper proposes a well-developed mathematical apparatus for a probabilistic representation of concepts, drawing upon methods developed in quantum theory, to propose a formal test that can determine whether a specific conceptual combination is compositional or not. This test examines a joint probability distribution modeling the combination, asking whether or not it is factorizable. Empirical studies indicate that some combinations should be considered non-compositional.
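
The paper's apparatus is quantum-theoretic, but the core idea of the test can be illustrated with a minimal sketch (the joint distribution below is a fabricated example): a combination is treated as compositional when its joint distribution factorizes into the product of its marginals.

```python
import numpy as np

# Hypothetical joint distribution over interpretations of a two-word
# compound: rows = senses of word A, columns = senses of word B.
joint = np.array([[0.35, 0.05],
                  [0.05, 0.55]])

def is_factorizable(joint, tol=1e-2):
    """Return True if P(a, b) ~= P(a) * P(b) for every cell."""
    p_a = joint.sum(axis=1)          # marginal over word A senses
    p_b = joint.sum(axis=0)          # marginal over word B senses
    product = np.outer(p_a, p_b)     # the factorized (compositional) model
    return np.allclose(joint, product, atol=tol)

print(is_factorizable(joint))  # False: the senses are correlated,
                               # so this combination is non-compositional
```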

Relevance: 20.00%

Abstract:

Digital human models (DHM) have evolved into useful tools for ergonomic workplace design and product development, and are found in various industries and in education. The DHM systems that dominate the market were developed for specific purposes and differ significantly, which is reflected not only in incompatible results of DHM simulations, but also in misunderstandings of how DHM simulations relate to real-world problems. While DHM developers are constrained by uncertainty about user needs and a lack of standards for model data, users are confined to one specific product and cannot exchange results or upgrade to another DHM system, as their previous results would be rendered worthless. Furthermore, the origin and validity of anthropometric and biomechanical data are not transparent to the user. The lack of standardisation in DHM systems has become a major roadblock to further system development, affecting all stakeholders in the DHM industry. Evidently, a framework for standardising digital human models is necessary to overcome current obstructions.

Relevance: 20.00%

Abstract:

Individual-based models describing the migration and proliferation of a population of cells frequently restrict the cells to a predefined lattice. An implicit assumption of this type of lattice-based model is that a proliferative population will always eventually fill the lattice. Here we develop a new lattice-free individual-based model that incorporates cell-to-cell crowding effects. We also derive approximate mean-field descriptions for the lattice-free model in two special cases motivated by commonly used experimental set-ups. Lattice-free simulation results are compared to these mean-field descriptions and to a corresponding lattice-based model. Data from a proliferation experiment are used to estimate the parameters for the new model, including the cell proliferation rate, showing that the model fits the data well. An important aspect of the lattice-free model is that the confluent cell density is not predefined, as it is with lattice-based models, but is an emergent model property. As a consequence of the more realistic, irregular configuration of cells in the lattice-free model, the population growth rate is much slower at high cell densities and the population cannot reach the same confluent density as an equivalent lattice-based model.
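
A minimal sketch of the kind of lattice-free mechanism described (parameter names and values are illustrative assumptions, not the paper's): cells are points in continuous space, and a proliferation attempt is aborted when the daughter position would overlap an existing cell, so crowding slows growth at high density.

```python
import numpy as np

rng = np.random.default_rng(0)

DOMAIN = 50.0     # side length of a square domain (arbitrary units)
DIAM = 1.0        # cell diameter: the exclusion distance for crowding
P_PROLIF = 0.05   # per-cell proliferation probability per time step

cells = [pos for pos in rng.uniform(0, DOMAIN, size=(10, 2))]  # seed cells

def step(cells):
    """One time step: each cell tries to place a daughter at distance DIAM."""
    for parent in list(cells):                      # snapshot of this step
        if rng.random() >= P_PROLIF:
            continue
        theta = rng.uniform(0, 2 * np.pi)
        daughter = (parent + DIAM * np.array([np.cos(theta), np.sin(theta)])) % DOMAIN
        # Crowding: abort the division if the daughter overlaps anyone.
        # (Distances ignore the periodic wrap, for brevity.)
        dists = np.linalg.norm(np.asarray(cells) - daughter, axis=1)
        if np.all(dists >= DIAM):
            cells.append(daughter)

for t in range(200):
    step(cells)
print(f"population after 200 steps: {len(cells)}")  # saturates below lattice packing
```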

Relevance: 20.00%

Abstract:

The importance of actively managing and analyzing business processes is acknowledged more than ever in today's organizations. Business processes form an essential part of an organization, and their application areas are manifold. Most organizations keep records of the various activities that have been carried out for auditing purposes, but these records are rarely used for analysis. This paper describes the design and implementation of a process analysis tool that replays, analyzes and visualizes a variety of performance metrics using a process definition and its execution logs. Performing performance analysis on existing and planned process models offers organizations a way to detect bottlenecks within their processes and make more effective process improvement decisions. Our technique is applied to processes modeled in the YAWL language. Execution logs of process instances are compared against the corresponding YAWL process model and replayed in a robust manner, taking into account any noise in the logs. Finally, performance characteristics, obtained from replaying the log in the model, are projected onto the model.
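
A toy sketch of the replay idea (not the YAWL-based implementation; task names and events are fabricated): events are matched against the tasks in a process definition, noisy events that do not fit the model are skipped, and per-task performance metrics are then projected onto the model.

```python
from collections import defaultdict
from datetime import datetime

MODEL = ["receive", "check", "approve", "archive"]  # hypothetical task order

log = [  # (case id, task, start, end): fabricated example events
    ("c1", "receive", "2024-01-01 09:00", "2024-01-01 09:10"),
    ("c1", "check",   "2024-01-01 09:20", "2024-01-01 10:00"),
    ("c1", "ship",    "2024-01-01 10:05", "2024-01-01 10:06"),  # noise
    ("c2", "receive", "2024-01-01 11:00", "2024-01-01 11:05"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

durations = defaultdict(list)
for case, task, start, end in log:
    if task not in MODEL:              # robust replay: skip noisy events
        continue
    durations[task].append((parse(end) - parse(start)).total_seconds())

# "Project" the metric onto the model: report per-task mean service time.
for task in MODEL:
    if durations[task]:
        print(f"{task}: mean {sum(durations[task]) / len(durations[task]):.0f}s")
```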

Relevance: 20.00%

Abstract:

During the late 20th century it was proposed that a design aesthetic reflecting current ecological concerns was required within the overall domain of the built environment, and specifically within landscape design. To address this, some authors suggested various theoretical frameworks upon which such an aesthetic could be based. Within these frameworks there was an underlying theme that the patterns and processes of Nature may have the potential to form this aesthetic: an aesthetic based on fractal rather than Euclidean geometry. In order to understand how fractal geometry, described as the geometry of Nature, could become the referent for a design aesthetic, this research examines the mathematical concepts of fractal geometry and the philosophical concepts underlying the terms ‘Nature’ and ‘aesthetics’.

The findings of this initial research meant that a new definition of Nature was required in order to overcome the barrier presented by the western philosophical Nature-culture duality. This new definition of Nature is based on the type and use of energy. Similarly, it became clear that current usage of the term aesthetics has more in common with the term ‘style’ than with its correct philosophical meaning. The aesthetic philosophy of both art and the environment recognises different aesthetic criteria related to either the subject or the object, such as aesthetic experience, aesthetic attitude, aesthetic value, aesthetic object, and aesthetic properties. Given these criteria, and the fact that the concept of aesthetics is still an active and ongoing philosophical discussion, this work focuses on the criteria of aesthetic properties and the aesthetic experience or response they engender.

The examination of fractal geometry revealed that it is a geometry based on scale rather than on the location of a point within a three-dimensional space. This enables fractal geometry to describe the complex forms and patterns created through the processes of Wild Nature. Although fractal geometry has been used to analyse the patterns of built environments from a plan perspective, it became clear from the initial review of the literature that there was a complete gap in knowledge about the fractal properties of the environments people experience every day as they move through them. To overcome this, 21 different landscapes, ranging from highly developed city centres to relatively untouched landscapes of Wild Nature, were analysed. Although this work shows that the fractal dimension can be used to differentiate between overall landscape forms, it also shows that by itself it cannot differentiate between all of the images analysed. To overcome this, two further parameters based on the underlying structural geometry embedded within the landscape are discussed: the Power Spectrum Median Amplitude and the Level of Isotropy within the Fourier Power Spectrum. Based on the detailed analysis of these parameters a greater understanding of the structural properties of landscapes has been gained. With this understanding, this research has moved the field of landscape design a step closer to being able to articulate a new aesthetic for ecological design.
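
For readers unfamiliar with the core measurement, the fractal dimension of an image can be estimated with the standard box-counting method; the sketch below is a generic illustration, not the analysis pipeline used in this research.

```python
import numpy as np

def box_counting_dimension(image, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a square binary image.

    Count the boxes of side s that contain any occupied pixel, then fit
    the slope of log N(s) against log(1/s).
    """
    n = image.shape[0]
    counts = []
    for s in sizes:
        m = n - n % s                      # trim so boxes tile evenly
        boxes = image[:m, :m].reshape(m // s, s, m // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# A straight diagonal line should give a dimension near 1; a fully
# occupied image gives a dimension near 2.
img = np.zeros((64, 64), dtype=bool)
img[np.arange(64), np.arange(64)] = True
print(round(box_counting_dimension(img), 2))   # ~1.0
```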

Relevance: 20.00%

Abstract:

In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
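
The three rate regimes compared in the study can be sketched as follows (parameter values are illustrative assumptions): under a strict clock the rate never changes, under the autocorrelated model each branch's rate is drawn from a lognormal distribution centred on its parent's rate, and under the uncorrelated model each branch draws its rate independently.

```python
import numpy as np

rng = np.random.default_rng(1)

def child_rate(parent_rate, model, nu=0.2, mean_rate=1.0):
    """Draw a branch rate under one of three rate-change regimes."""
    if model == "clock":
        return parent_rate                         # constant rate
    if model == "autocorrelated":
        # Lognormal centred on the parent's rate; nu controls the drift.
        return rng.lognormal(np.log(parent_rate), nu)
    if model == "uncorrelated":
        # Each branch draws independently of its parent.
        return rng.exponential(mean_rate)
    raise ValueError(model)

# Walk a root-to-tip path of 8 branches under each model.
for model in ("clock", "autocorrelated", "uncorrelated"):
    rate, rates = 1.0, []
    for _ in range(8):
        rate = child_rate(rate, model)
        rates.append(rate)
    print(model, np.round(rates, 2))
```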

Relevance: 20.00%

Abstract:

The opening phrase of the title is from Charles Darwin’s notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or ‘causes’; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, thus rejecting a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution; formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is set in a Popperian framework, where science is defined as that area of study in which it is possible, in principle, to find evidence against hypotheses: they are in principle falsifiable. However, with time, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the then-current boundaries of falsifiable science, but new techniques and ideas are increasingly expanding those boundaries, and it is appropriate to re-examine some topics. It often appears that over the last few decades there has been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; an assumption is simply made that some physical factor ‘drives’ evolution. It is necessary to examine our assumptions much more carefully: what is meant by physical factors ‘driving’ evolution, or by an ‘explosive radiation’? Our discussion focuses on two of the six mass extinctions: the fifth, the events in the Late Cretaceous; and the sixth, which started at least 50,000 years ago and is ongoing.

Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow, in that it lumped four models from Penny and Phillips (2004) into one. This reduction is too simplistic, in that we need to know about survival and about ecological and morphological divergences during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible.

Mass extinction number six: human impacts. On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still large numbers of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the ‘overkill’ hypothesis). We need also to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing ‘blame’. While spontaneous generation was still universally accepted, there was the expectation that animals would simply continue to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains are less disrupted by time.

Relevance: 20.00%

Abstract:

Global rates of skin cancers induced by ultraviolet radiation (UVR) exposure are presently on the rise. In view of this, current knowledge gaps in the biology of photocarcinogenesis and skin cancer progression urgently need to be addressed. One factor that has limited skin cancer research has been the need for a reproducible and physiologically relevant model able to represent the complexity of human skin. This review outlines the main in vitro models of UVR-induced skin damage in current use. This includes the use of conventional two-dimensional cell culture techniques and the major animal models that have been employed in photobiology and photocarcinogenesis research. Additionally, the progression towards the use of cultured skin explants and tissue-engineered skin constructs, and their utility as models of native skin's responses to UVR, is described. The inherent advantages and disadvantages of these in vitro systems are also discussed.

Relevance: 20.00%

Abstract:

Objectives: To investigate the effect of hot and cold temperatures on ambulance attendances. Design: An ecological time series study. Setting and participants: The study was conducted in Brisbane, Australia. We collected information on 783,935 daily ambulance attendances, along with data on associated meteorological variables and air pollutants, for the period 2000–2007. Outcome measures: The total number of ambulance attendances was examined, along with those related to cardiovascular, respiratory and other non-traumatic conditions. Generalised additive models were used to assess the relationship between daily mean temperature and the number of ambulance attendances. Results: There were statistically significant relationships between mean temperature and ambulance attendances for all categories. Acute heat effects were found, with a 1.17% (95% CI: 0.86%, 1.48%) increase in total attendances for a 1 °C increase above the threshold (0–1 days lag). Cold effects were delayed and longer lasting, with a 1.30% (0.87%, 1.73%) increase in total attendances for a 1 °C decrease below the threshold (2–15 days lag). Harvesting was observed following the initial acute periods of heat effects, but not for cold effects. Conclusions: This study shows that both hot and cold temperatures led to increases in ambulance attendances for different medical conditions. Our findings support the notion that ambulance attendance records are a valid and timely source of data for use in the development of local weather/health early warning systems.
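
A minimal sketch of the modelling approach (hypothetical file and column names; a spline-based Poisson GLM standing in for the study's generalised additive models): daily attendance counts are regressed on a smooth function of mean temperature, which lets hot and cold effects differ around a threshold.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical daily data: one row per day with columns
#   attendances (daily count) and temp (daily mean temperature, deg C).
df = pd.read_csv("brisbane_ambulance_daily.csv")  # hypothetical file name

# A B-spline in temperature lets the attendance-temperature curve bend,
# so hot- and cold-side effects can differ, in the spirit of the
# study's generalised additive models.
model = smf.glm(
    "attendances ~ bs(temp, df=4)",
    data=df,
    family=sm.families.Poisson(),
).fit()
print(model.summary())
```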

Relevance: 20.00%

Abstract:

Purpose – The internet is transforming possibilities for creative interaction, experimentation and cultural consumption in China and raising important questions about the role that “publishers” might play in an open and networked digital world. The purpose of this paper is to consider the role that copyright is playing in the growth of a publishing industry that is being “born digital”. Design/methodology/approach – The paper approaches online literature as an example of a creative industry that is generating value for a wider creative economy through its social network market functions. It builds on the social network market definition of the creative industries proposed by Potts et al. and uses this definition to interrogate the role that copyright plays in a rapidly evolving creative economy. Findings – The rapid growth of a market for crowd-sourced content is combining with growing commercial freedom in cultural space to produce a dynamic landscape of business model experimentation. Using the social web to engage audiences, generate content, establish popularity and build reputation, and then converting those assets into profit through less networked channels, appears to be a driving strategy in the expansion of wider creative industries markets in China. Originality/value – At a moment when publishing industries all over the world are struggling to come to terms with digital technology, the emergence of a rapidly growing area of publishing that is being born digital offers important clues about the future of publishing and what social network markets might mean for the role of copyright in a digital age.

Relevance: 20.00%

Abstract:

The positions of housing demand and supply are not consistent. The Australian situation counters the experience of many other parts of the world in the aftermath of the Global Financial Crisis, with residential housing prices proving particularly resilient. A seemingly inexorable housing demand remains a critical issue affecting the socio-economic landscape. Underpinned by high levels of population growth fuelled by immigration, and further buoyed by sustained historically low interest rates, increasing income levels, and increased government assistance for first home buyers, this strong housing demand ensures that problems related to housing affordability continue almost unabated. A significant but less visible factor impacting housing affordability relates to holding costs. Although only one contributor in the housing affordability matrix, the nature and extent of their impact requires elucidation: for example, the computation and methodology behind the calculation of holding costs varies widely, and in some instances they are ignored completely. In addition, ambiguity exists regarding which elements comprise holding costs, thereby affecting the assessment of their relative contribution. Such anomalies may be explained by considering that assessment is conducted over time in an ever-changing environment. A strong relationship with opportunity cost, in turn dependent inter alia upon prevailing inflation and/or interest rates, adds further complexity. By extending research in the general area of housing affordability, this thesis provides a detailed investigation of the elements related to holding costs, specifically in the context of mid-sized (i.e. between 15 and 200 lots) greenfield residential property developments in South East Queensland. With the dimensions of holding costs and their influence over housing affordability determined, the null hypothesis H0, that holding costs are not passed on, can be addressed. Arriving at these conclusions involves the development of robust economic and econometric models which seek to clarify the component impacts of holding cost elements. An explanatory sequential design research methodology has been adopted, whereby the compilation and analysis of quantitative data and the development of an economic model are informed by the subsequent collection and analysis of primarily qualitative data derived from surveying development-related organisations. Ultimately, there are significant policy implications for the framework used in Australian jurisdictions to promote, retain, or otherwise maximise the opportunities for affordable housing.
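
As a simple illustration of why the computation methodology matters (all figures are hypothetical, and the method is a common approximation rather than the thesis's model): holding cost can be approximated as the compounding opportunity cost of capital tied up in the land plus recurring charges over the holding period.

```python
# Hypothetical worked example: holding cost of englobo land held during
# a development approval period. All figures are illustrative.
land_value = 2_000_000    # purchase price (AUD)
annual_rate = 0.07        # opportunity cost of capital (or debt rate)
holding_years = 2.5       # purchase to first lot settlement
annual_charges = 30_000   # council rates, land tax, insurance per year

# The capital cost compounds; recurring charges are accrued linearly here.
opportunity_cost = land_value * ((1 + annual_rate) ** holding_years - 1)
recurring = annual_charges * holding_years
total = opportunity_cost + recurring

print(f"opportunity cost:   {opportunity_cost:,.0f}")
print(f"recurring charges:  {recurring:,.0f}")
print(f"total holding cost: {total:,.0f}")
# Whether, and how, this total is passed on to lot prices is exactly
# the question behind the thesis's null hypothesis H0.
```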

Relevance: 20.00%

Abstract:

This thesis develops a detailed conceptual design method and a system software architecture, defined with a parametric and generative evolutionary design system, to support an integrated interdisciplinary building design approach. The research recognises the need to shift design efforts toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication on the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements across a wide range of environmental and social circumstances, so a rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and their ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research proposes a design method and system that promote a collaborative relationship between human creativity and computer capability. The tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary systems to performance assessment applications, which are used as prioritised fitness functions. This will produce design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach will produce solutions through a design process that considers and balances the requirements of all aspects of the design. Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined, and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, where key aspects of the system that had not previously been proven in the literature were implemented to test the feasibility of the system. As a result of combining the existing knowledge in the area with the newly-verified key aspects of the proposed system, this research can form the basis for a future software development project. The evaluation stage, which includes building the prototype system to test and evaluate system performance against the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. The HEAD system consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components.
The design schema provides constraints on the generation of designs, thus enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of the designer's human creativity within a dynamic design framework that can be encoded and then executed through the use of evolutionary genetic algorithms. It incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques, and the process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded into the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem, so that the design requirements of each level are dealt with separately and then reassembled in a bottom-up approach, reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach of exploring the range of design solutions through modification of the design schema, as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions into the genetic algorithm, each relevant to a specific level. This supports an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and computer capability: the design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity, while the hierarchical nature of the system, by focusing on solutions for the relevant sub-problems at the appropriate levels of detail, assists in the design decision-making process.
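
A minimal sketch of the core mechanism (illustrative only, not the HEAD system): a genetic algorithm evolving a one-dimensional room layout against a fitness function built from graph-encoded adjacency requirements. The HEAD system embeds multiple fitness functions at each of its four levels; this sketch uses a single adjacency objective to keep the idea visible.

```python
import random

random.seed(42)

# Hypothetical adjacency requirements (a graph): pairs of rooms that
# should sit next to each other on a 1D strip of six slots.
ROOMS = ["entry", "office", "meeting", "kitchen", "store", "wc"]
ADJACENT = [("entry", "office"), ("office", "meeting"), ("kitchen", "wc")]

def fitness(layout):
    """Count satisfied adjacency requirements (higher is better)."""
    pos = {room: i for i, room in enumerate(layout)}
    return sum(abs(pos[a] - pos[b]) == 1 for a, b in ADJACENT)

def mutate(layout):
    """Swap two rooms: a permutation-preserving mutation."""
    a, b = random.sample(range(len(layout)), 2)
    child = layout[:]
    child[a], child[b] = child[b], child[a]
    return child

# Simple (mu + lambda) evolution with truncation selection.
population = [random.sample(ROOMS, len(ROOMS)) for _ in range(20)]
for generation in range(50):
    offspring = [mutate(random.choice(population)) for _ in range(20)]
    population = sorted(population + offspring, key=fitness, reverse=True)[:20]

best = population[0]
print(best, "satisfies", fitness(best), "of", len(ADJACENT), "adjacencies")
```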