956 results for Language Models


Relevance:

20.00%

Publisher:

Abstract:

Compositionality is a frequently made assumption in linguistics, and yet many human subjects reveal highly non-compositional word associations when confronted with novel concept combinations. This article will show how a non-compositional account of concept combinations can be supplied by modelling them as interacting quantum systems.
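As a small illustration of the modelling idea, the sketch below treats two concept senses as a two-qubit-style joint state: a compositional combination corresponds to a separable (rank-1) amplitude matrix, while a non-compositional one is entangled. The concepts, senses and amplitudes are invented for illustration and are not drawn from the article.

```python
import numpy as np

# psi[i, j] = amplitude that concept A is read in sense i and concept B in
# sense j (e.g. "boxer" as athlete/dog, "bat" as animal/equipment).
psi = np.array([[0.80, 0.08],
                [0.08, 0.59]])
psi /= np.linalg.norm(psi)        # normalise the joint state

joint_probs = psi**2              # Born rule with real amplitudes
marg_A = joint_probs.sum(axis=1)
marg_B = joint_probs.sum(axis=0)

# A compositional state factorises: joint = outer(marg_A, marg_B) and the
# amplitude matrix has Schmidt rank 1. More than one non-zero singular value
# signals entanglement, i.e. non-compositional association.
print("joint:\n", joint_probs)
print("product of marginals:\n", np.outer(marg_A, marg_B))
print("singular values:", np.linalg.svd(psi, compute_uv=False))
```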

Relevance:

20.00%

Publisher:

Abstract:

The Australian income tax regime is generally regarded as a mechanism by which the Federal Government raises revenue, with much of the revenue raised used to support public spending programs. A prime example of this type of spending program is health care. However, a government may also decide that the private sector should provide a greater share of the nation's health care. To achieve such a policy it can bring about change through positive regulation, or it can use the taxation regime, via tax expenditures, not to raise revenue but to steer or influence individuals in its desired direction. When used for this purpose, tax expenditures steer taxpayers towards or away from certain behaviour by either imposing costs on them or providing benefits to them. Within the health sector, the Australian Federal Government deploys social steering via the tax system: the Medicare Levy Surcharge and the 30 percent Private Health Insurance Rebate are intended to steer taxpayer behaviour towards the Government's policy goal of increasing the amount of health provision through the private sector. These steering mechanisms are complemented by the 'Lifetime Health Cover Initiative'. This article, through the lens of behavioural economics, considers how these assorted mechanisms might have been expected to operate and whether they encourage individuals to purchase private health insurance.

Relevance:

20.00%

Publisher:

Abstract:

Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport is a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of them. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to a review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics they evaluate and their usage scenarios are discussed. Capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link these to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to capture trade-offs between multiple criteria such as security and processing time. Based on the CONOPS framework and the literature findings, guidance is provided for the development of future airport terminal models.

Relevance:

20.00%

Publisher:

Abstract:

Fire safety of buildings has been recognised as very important by the building industry and the community at large. Gypsum plasterboards are widely used to protect light gauge steel frame (LSF) walls all over the world. Plasterboard consists mainly of gypsum (CaSO4·2H2O), which contains free and chemically bound water in its crystal structure, together with calcium carbonate (CaCO3). The dehydration of gypsum and the decomposition of calcium carbonate absorb heat, and are thus able to protect LSF walls from fires. Kolarkar and Mahendran (2008) developed an innovative composite wall panel system in which insulation was sandwiched between two plasterboards to improve the thermal and structural performance of LSF wall panels under fire conditions. To understand the performance of gypsum plasterboards and LSF wall panels under standard fire conditions, many experiments were conducted in the Fire Research Laboratory of Queensland University of Technology (Kolarkar, 2010). Fire tests were conducted on single, double and triple layers of Type X gypsum plasterboards and on load-bearing LSF wall panels under standard fire conditions. However, suitable numerical models had not been developed to investigate the thermal performance of LSF walls using the innovative composite panels under standard fire conditions, and continued reliance on expensive and time-consuming fire tests is not acceptable. This research therefore developed suitable numerical models to investigate the thermal performance of both plasterboard assemblies and load-bearing LSF wall panels. SAFIR, a finite element program, was used to investigate the thermal performance of gypsum plasterboard assemblies and LSF wall panels under standard fire conditions. Appropriate values of the important thermal properties were proposed for plasterboards and insulations based on laboratory tests, a literature review, and comparisons of the finite element analysis results for small-scale plasterboard assemblies from this research with the corresponding experimental results from Kolarkar (2010). The important thermal properties (thermal conductivity, specific heat capacity and density) of gypsum plasterboard and insulation materials were proposed as functions of temperature and used in the numerical models of load-bearing LSF wall panels. Using these thermal properties, the developed finite element models were able to accurately predict the time-temperature profiles of plasterboard assemblies, and to predict those of load-bearing LSF wall systems reasonably well despite the many complexities present in these systems under fire conditions. This thesis presents the details of the finite element models of plasterboard assemblies and load-bearing LSF wall panels, including those with the composite panels developed by Kolarkar and Mahendran (2008). It examines and compares the thermal performance of composite panels based on different insulating materials of varying densities and thicknesses using 11 small-scale tests, and makes suitable recommendations for improved fire performance of stud wall panels protected by these composite panels. It also presents thermal performance data for LSF wall systems and demonstrates the superior performance of LSF wall systems using the composite panels. Using the developed finite element models of LSF walls, this thesis proposes new LSF wall systems with increased fire rating. The developed finite element models are particularly useful for comparing the thermal performance of different wall panel systems without time-consuming and expensive fire tests.
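As context for how such temperature-dependent properties enter a finite element model, the sketch below encodes an apparent specific heat curve for gypsum plasterboard as a piecewise-linear function of temperature. The breakpoint temperatures and values are illustrative assumptions only, not the laboratory-derived values proposed in the thesis.

```python
import numpy as np

# Illustrative apparent specific heat of gypsum plasterboard vs temperature.
# The first peak mimics the heat absorbed by the dehydration of gypsum
# (CaSO4·2H2O) around 100-200 °C; the second mimics the decomposition of
# calcium carbonate (CaCO3) at higher temperatures. All numbers are
# placeholder assumptions, not measured data.
temps_C = np.array([20, 100, 150, 200, 400, 650, 750, 1200], dtype=float)
cp_J_per_kgK = np.array([950, 950, 12000, 950, 950, 2500, 950, 950], dtype=float)

def specific_heat(T):
    """Linear interpolation between tabulated points, the form in which a
    thermal FE solver such as SAFIR consumes a temperature-dependent property."""
    return np.interp(T, temps_C, cp_J_per_kgK)

print(specific_heat(150.0))  # inside the dehydration peak
```

Thermal conductivity and density would be tabulated in the same way, so the solver sees all three properties as functions of temperature.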

Relevance:

20.00%

Publisher:

Abstract:

Digital human models (DHM) have evolved into useful tools for ergonomic workplace design and product development, and are found in various industries and in education. The DHM systems that dominate the market were developed for specific purposes and differ significantly, which is reflected not only in incompatible results from DHM simulations but also in misunderstandings of how DHM simulations relate to real-world problems. While DHM developers are constrained by uncertainty about user needs and a lack of standards for model data, users are confined to one specific product and cannot exchange results or upgrade to another DHM system, as their previous results would be rendered worthless. Furthermore, the origin and validity of anthropometric and biomechanical data are not transparent to the user. The lack of standardisation in DHM systems has become a major roadblock to further system development, affecting all stakeholders in the DHM industry. Evidently, a framework for standardising digital human models is necessary to overcome current obstructions.

Relevance:

20.00%

Publisher:

Abstract:

Individual-based models describing the migration and proliferation of a population of cells frequently restrict the cells to a predefined lattice. An implicit assumption of this type of lattice-based model is that a proliferative population will always eventually fill the lattice. Here we develop a new lattice-free individual-based model that incorporates cell-to-cell crowding effects. We also derive approximate mean-field descriptions for the lattice-free model in two special cases motivated by commonly used experimental setups. Lattice-free simulation results are compared to these mean-field descriptions and to a corresponding lattice-based model. Data from a proliferation experiment are used to estimate the parameters for the new model, including the cell proliferation rate, showing that the model fits the data well. An important aspect of the lattice-free model is that the confluent cell density is not predefined, as it is in lattice-based models, but is an emergent model property. As a consequence of the more realistic, irregular configuration of cells in the lattice-free model, the population growth rate is much slower at high cell densities and the population cannot reach the same confluent density as an equivalent lattice-based model.
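The crowding mechanism described here can be sketched in a few lines: each cell attempts to divide, the daughter is placed one cell diameter away in a uniformly random direction, and the attempt is aborted if the daughter would overlap an existing cell. This is a minimal illustration under assumed parameter values, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(1)
diameter = 1.0      # assumed cell diameter: sets the crowding length scale
p_prolif = 0.05     # assumed proliferation probability per cell per step
domain = 20.0

cells = rng.uniform(0, domain, size=(10, 2))  # initial random seeding

def step(cells):
    """One step of lattice-free proliferation with crowding: a division
    attempt fails if the daughter would sit within one diameter of any cell.
    Domain boundaries are ignored for brevity."""
    daughters = []
    for cell in cells:
        if rng.random() < p_prolif:
            theta = rng.uniform(0.0, 2.0 * np.pi)
            d = cell + diameter * np.array([np.cos(theta), np.sin(theta)])
            if np.all(np.linalg.norm(cells - d, axis=1) >= diameter):
                daughters.append(d)
    return np.vstack([cells] + daughters) if daughters else cells

for _ in range(500):
    cells = step(cells)

# The confluent density emerges from the geometry rather than being imposed
# by a lattice, which is the key property the abstract highlights.
print("emergent density:", len(cells) / domain**2)
```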

Relevance:

20.00%

Publisher:

Abstract:

In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
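To see why the choice of rate model matters, the sketch below draws branch rates under a strict clock, an uncorrelated exponential model and an autocorrelated lognormal model (here along a simple root-to-tip chain rather than a full tree). The parameter values are illustrative assumptions, not the simulation settings used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)
n_branches = 16
base_rate = 1e-3  # assumed substitutions/site/Myr, for illustration only

# Strict molecular clock: a single rate shared by every branch.
strict = np.full(n_branches, base_rate)

# Uncorrelated exponential model: each branch draws its rate independently.
uncorrelated = rng.exponential(scale=base_rate, size=n_branches)

# Autocorrelated lognormal model: a branch's rate drifts lognormally from
# its parent's rate; a root-to-tip chain stands in for a real tree here.
rates = [base_rate]
for _ in range(n_branches - 1):
    rates.append(rates[-1] * rng.lognormal(mean=0.0, sigma=0.2))
autocorrelated = np.array(rates)

for name, r in [("strict", strict), ("uncorrelated", uncorrelated),
                ("autocorrelated", autocorrelated)]:
    print(f"{name:14s} mean={r.mean():.2e}  cv={r.std() / r.mean():.2f}")
```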

Relevance:

20.00%

Publisher:

Abstract:

Purpose – The rapidly changing role of capital city airports has placed demands on surrounding infrastructure. The need for infrastructure management and coordination is increasing as airports and cities grow and share common infrastructure frameworks. The purpose of this paper is to document the changing context in Australia, where the privatisation of airports has stimulated considerable land development, with resulting pressures on surrounding infrastructure provision. It aims to describe a tool being developed to support decision-making between the various stakeholders in the airport region. The use of planning support systems improves both communication and data transfer between stakeholders and provides a foundation for complex decisions on infrastructure. Design/methodology/approach – The research uses a case study approach and focuses on Brisbane International Airport and Brisbane City Council. The research is primarily descriptive and provides an empirical assessment of the challenges of developing and implementing planning support systems as a tool for governance and decision-making. Findings – The research assesses the challenges in implementing a common data platform for stakeholders. Agency data platforms and models, traditional roles in infrastructure planning, and the integration of similar data platforms all present barriers to sharing a common language. A decision support system has to be shared by all stakeholders, with a common platform versatile enough to support scenarios and changing conditions. The use of iPads for scenario modelling gives stakeholders the opportunity to interact, compare scenarios and views, and work with the modellers to explore other options. Originality/value – The research confirms that planning support systems have to be accessible to, and interactive for, their users. The Airport City concept is a new and evolving focus for airport development and will place continuing pressure on infrastructure servicing. A coordinated and efficient approach to infrastructure decision-making is critical, and an interactive planning support system that can model infrastructure scenarios provides a sound tool for governance.

Relevance:

20.00%

Publisher:

Abstract:

Presently, global rates of skin cancers induced by ultraviolet radiation (UVR) exposure are on the rise. In view of this, current knowledge gaps in the biology of photocarcinogenesis and skin cancer progression urgently need to be addressed. One factor that has limited skin cancer research has been the need for a reproducible and physiologically-relevant model able to represent the complexity of human skin. This review outlines the main currently-used in vitro models of UVR-induced skin damage. This includes the use of conventional two-dimensional cell culture techniques and the major animal models that have been employed in photobiology and photocarcinogenesis research. Additionally, the progression towards the use of cultured skin explants and tissue-engineered skin constructs, and their utility as models of native skin's responses to UVR are described. The inherent advantages and disadvantages of these in vitro systems are also discussed.

Relevance:

20.00%

Publisher:

Abstract:

Purpose – The internet is transforming possibilities for creative interaction, experimentation and cultural consumption in China and raising important questions about the role that “publishers” might play in an open and networked digital world. The purpose of this paper is to consider the role that copyright is playing in the growth of a publishing industry that is being “born digital”. Design/methodology/approach – The paper approaches online literature as an example of a creative industry that is generating value for a wider creative economy through its social network market functions. It builds on the social network market definition of the creative industries proposed by Potts et al. and uses this definition to interrogate the role that copyright plays in a rapidly-evolving creative economy. Findings – The rapid growth of a market for crowd-sourced content is combining with growing commercial freedom in cultural space to produce a dynamic landscape of business model experimentation. Using the social web to engage audiences, generate content, establish popularity and build reputation and then converting those assets into profit through less networked channels appears to be a driving strategy in the expansion of wider creative industries markets in China. Originality/value – At a moment when publishing industries all over the world are struggling to come to terms with digital technology, the emergence of a rapidly-growing area of publishing that is being born digital offers important clues about the future of publishing and what social network markets might mean for the role of copyright in a digital age.

Relevance:

20.00%

Publisher:

Abstract:

This thesis develops a detailed conceptual design method and a system software architecture, defined with a parametric and generative evolutionary design system, to support an integrated interdisciplinary building design approach. The research recognises the need to shift design effort toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication for the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements across a wide range of environmental and social circumstances, so a rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and their ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research proposes a design method and system that promote a collaborative relationship between human creativity and computer capability. The tectonic design approach is adopted as a process-oriented approach that values the process of design as much as the product. The aim is to connect the evolutionary system to performance assessment applications, which are used as prioritised fitness functions; this produces design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach produces solutions through a design process that considers and balances the requirements of all aspects of the design. Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined, and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, in which key aspects of the system that had not previously been proven in the literature were implemented to test its feasibility. By combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the basis for a future software development project. The evaluation stage, which includes building a prototype system to test and evaluate system performance against the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. It consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components.
The design schema provides constraints on the generation of designs, enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of designers' creativity within a dynamic design framework that can be encoded and then executed through evolutionary genetic algorithms. It incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques, and the process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded in the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem so that the requirements of each level can be dealt with separately, and then reassembling them in a bottom-up approach, reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach of exploring the range of design solutions through modification of the design schema, as understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows multiple fitness functions, each relevant to a specific level, to be embedded in the genetic algorithm, supporting an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity, and by focusing on finding solutions for the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists in the design decision-making process.
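At the 'Layout' level, the core idea of evolving designs against graph-based adjacency constraints can be illustrated with a toy genetic algorithm. The rooms, grid, operators and fitness below are illustrative assumptions, not the HEAD system's actual encoding or its prioritised fitness functions.

```python
import random

random.seed(0)

# Hypothetical adjacency graph: pairs of rooms required to share an edge.
ROOMS = ["lobby", "office", "meeting", "kitchen", "store", "void"]
ADJACENCY = [("lobby", "office"), ("office", "meeting"),
             ("lobby", "kitchen"), ("kitchen", "store")]
GRID = [(r, c) for r in range(2) for c in range(3)]  # 2x3 placement grid

def fitness(layout):
    """Count satisfied adjacency requirements: a stand-in for the
    level-specific fitness functions embedded in the genetic algorithm."""
    pos = dict(zip(layout, GRID))
    def touching(a, b):
        (r1, c1), (r2, c2) = pos[a], pos[b]
        return abs(r1 - r2) + abs(c1 - c2) == 1
    return sum(touching(a, b) for a, b in ADJACENCY)

def mutate(layout):
    """Swap two rooms: the simplest possible layout mutation operator."""
    i, j = random.sample(range(len(layout)), 2)
    child = layout[:]
    child[i], child[j] = child[j], child[i]
    return child

population = [random.sample(ROOMS, len(ROOMS)) for _ in range(30)]
for _ in range(200):
    population.sort(key=fitness, reverse=True)
    elite = population[:10]
    population = elite + [mutate(random.choice(elite)) for _ in range(20)]

best = max(population, key=fitness)
print(best, fitness(best))  # maximum possible fitness is 4 (all pairs touch)
```

In the thesis's hierarchical set-up, a layout like this would then be passed upward, with further fitness functions applied at the 'Building' and 'Optimisation' levels.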

Relevance:

20.00%

Publisher:

Abstract:

Ocean processes are complex and have high variability in both time and space. Thus, ocean scientists must collect data over long time periods to obtain a synoptic view of ocean processes and resolve their spatiotemporal variability. One way to perform these persistent observations is to utilise an autonomous vehicle that can remain on deployment for long time periods. However, such vehicles are generally underactuated and slow moving. A challenge for persistent monitoring with these vehicles is dealing with currents while executing a prescribed path or mission. Here we present a path planning method for persistent monitoring that exploits ocean currents to increase navigational accuracy and reduce energy consumption.
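A minimal version of the underlying cost calculation: given a current field and a vehicle with fixed speed through the water, approximate the transit time of a candidate piecewise-linear path and prefer paths that ride favourable currents. The current field, vehicle speed and candidate paths are invented for illustration.

```python
import numpy as np

def current(p):
    """Hypothetical steady ocean current field (m/s) at position p = (x, y)."""
    x, y = p
    return np.array([0.3 * np.sin(y / 500.0), 0.1])

def transit_time(waypoints, v_water=1.0):
    """Approximate travel time along a path, crediting the current component
    along each leg (a small-cross-current approximation, good enough to rank
    candidate paths in a sketch like this)."""
    t = 0.0
    for a, b in zip(waypoints[:-1], waypoints[1:]):
        a, b = np.asarray(a, float), np.asarray(b, float)
        leg = b - a
        dist = np.linalg.norm(leg)
        heading = leg / dist
        v_ground = v_water + current((a + b) / 2.0) @ heading
        if v_ground <= 0.0:   # current too strong to make headway on this leg
            return np.inf
        t += dist / v_ground
    return t

paths = {
    "direct": [(0, 0), (2000, 2000)],
    "detour": [(0, 0), (1500, 400), (2000, 2000)],
}
times = {name: transit_time(p) for name, p in paths.items()}
print(times, "-> pick", min(times, key=times.get))
```

Lower transit time at the same water speed also means lower energy consumption, which is the trade-off the planner exploits.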

Relevance:

20.00%

Publisher:

Abstract:

Motorcycles are overrepresented in road traffic crashes and are particularly vulnerable at signalized intersections. The objective of this study is to identify causal factors affecting motorcycle crashes at both four-legged and T signalized intersections. Treating the data as time-series cross-sectional panels, this study explores different hierarchical Poisson models and finds that the model allowing an autoregressive lag-1 (AR-1) specification in the error term is the most suitable. Results show that the number of lanes at four-legged signalized intersections significantly increases motorcycle crashes, largely because of the higher exposure resulting from greater motorcycle accumulation at the stop line. Furthermore, the presence of a wide median and an uncontrolled left-turn lane on the major roadways of four-legged intersections exacerbates this potential hazard. For T signalized intersections, the presence of an exclusive right-turn lane on both major and minor roadways, and an uncontrolled left-turn lane on major roadways, increases motorcycle crashes. Motorcycle crashes increase on high-speed roadways because riders are more vulnerable and less likely to react in time during conflicts. The presence of red light cameras reduces motorcycle crashes significantly at both four-legged and T intersections. With red light cameras, motorcyclists are observed to be more disciplined in queuing at the stop line and less likely to jump-start at the onset of green, leaving them less exposed to conflicts.
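The panel structure described here (sites observed over consecutive years, with serially correlated unobserved effects) can be made concrete with a small simulation of a Poisson model whose log-rate carries an AR-1 error. The coefficients, sample sizes and the single covariate are illustrative assumptions, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites, n_years = 50, 10
rho, sigma = 0.6, 0.3            # assumed AR-1 coefficient and innovation sd
beta0, beta_lanes = -1.0, 0.15   # assumed intercept and lane effect
lanes = rng.integers(2, 7, size=n_sites)

counts = np.empty((n_sites, n_years), dtype=int)
for i in range(n_sites):
    # Start each site's error process from its stationary distribution.
    eps = rng.normal(0.0, sigma / np.sqrt(1.0 - rho**2))
    for t in range(n_years):
        log_mu = beta0 + beta_lanes * lanes[i] + eps
        counts[i, t] = rng.poisson(np.exp(log_mu))
        eps = rho * eps + rng.normal(0.0, sigma)   # AR-1 error evolution

# The serial correlation inflates the variance relative to a pure Poisson
# model with the same mean, which is what the AR-1 specification captures.
print("mean:", counts.mean().round(2), "variance:", counts.var().round(2))
```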

Relevance:

20.00%

Publisher:

Abstract:

Motorcyclists are the most crash-prone road-user group in many Asian countries, including Singapore; however, the factors influencing motorcycle crashes are still not well understood. This study examines the effects of various roadway characteristics, traffic control measures and environmental factors on motorcycle crashes at different location types, including expressways and intersections. Using techniques of categorical data analysis, this study has developed a set of log-linear models to investigate multi-vehicle motorcycle crashes in Singapore. Motorcycle crash risks in different circumstances have been calculated after controlling for exposure estimated by the induced exposure technique. Results show that night-time conditions increase the crash risk of motorcycles, particularly during merging and diverging manoeuvres on expressways and turning manoeuvres at intersections. Riders appear to exercise more care while riding on wet road surfaces, particularly at night. Many hazardous interactions at intersections are related to the failure of drivers to notice a motorcycle or to judge correctly the speed and distance of an oncoming motorcycle. Roadside conflicts due to stopping or waiting vehicles, and interactions with opposing traffic on undivided roads, have been found to be detrimental to motorcycle safety along arterial, main and local roads away from intersections. Based on these findings, several targeted countermeasures in the form of legislation, rider training and safety awareness programmes are recommended.
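The induced exposure idea mentioned here can be shown in miniature: not-at-fault involvements serve as a proxy for how much each condition is represented on the road, so the relative risk of a condition is its at-fault/not-at-fault ratio divided by the same ratio in a reference condition. All counts below are made up for illustration.

```python
# crashes[condition] = (at_fault_involvements, not_at_fault_involvements)
crashes = {
    "day":   (120, 300),
    "night": (90, 150),
}

def relative_risk(condition, reference="day"):
    """Induced-exposure relative risk: the not-at-fault column stands in for
    exposure that was never directly measured."""
    af_c, naf_c = crashes[condition]
    af_r, naf_r = crashes[reference]
    return (af_c / naf_c) / (af_r / naf_r)

print(f"night vs day: {relative_risk('night'):.2f}")  # > 1 => elevated risk
```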

Relevance:

20.00%

Publisher:

Abstract:

Traditional crash prediction models, such as generalized linear regression models, are incapable of taking into account the multilevel data structure that extensively exists in crash data. Disregarding the possible within-group correlations can lead to models giving unreliable and biased estimates of unknowns. This study proposes a five-level hierarchy, viz. (Geographic region level – Traffic site level – Traffic crash level – Driver-vehicle unit level – Vehicle-occupant level) × Time level, to establish a general form of multilevel data structure in traffic safety analysis. To properly model the potential cross-group heterogeneity due to the multilevel data structure, a framework of Bayesian hierarchical models that explicitly specify the multilevel structure and correctly yield parameter estimates is introduced and recommended. The proposed method is illustrated with an individual-severity analysis of intersection crashes using Singapore crash records. This study demonstrates the importance of accounting for within-group correlations, and the flexibility and effectiveness of the Bayesian hierarchical method in modeling the multilevel structure of traffic crash data.
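A two-level slice of such a hierarchy (sites nested within regions) can be written down directly in a probabilistic programming language. The sketch below is a minimal Bayesian hierarchical Poisson model with made-up data, assuming the PyMC library; the paper's actual hierarchy is deeper and its likelihood concerns individual crash severity rather than counts.

```python
import numpy as np
import pymc as pm

# Made-up data: crash counts at five sites nested within three regions.
region_of_site = np.array([0, 0, 1, 1, 2])
y = np.array([3, 5, 1, 2, 8])
log_exposure = np.log(np.array([12.0, 15.0, 7.0, 9.0, 20.0]))

with pm.Model() as model:
    # Region level: intercepts share a common prior, so regions borrow
    # strength from one another instead of being pooled or fully separate.
    sigma_r = pm.HalfNormal("sigma_r", sigma=1.0)
    a_region = pm.Normal("a_region", mu=0.0, sigma=sigma_r, shape=3)
    b = pm.Normal("b", mu=0.0, sigma=1.0)

    # Site level: each site's rate depends on its parent region's intercept,
    # which encodes exactly the within-group correlation the abstract warns
    # against ignoring.
    log_mu = a_region[region_of_site] + b * log_exposure
    pm.Poisson("y", mu=pm.math.exp(log_mu), observed=y)

    idata = pm.sample(1000, tune=1000, chains=2)
```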