Abstract:
The management of models over time in many domains requires different constraints to apply to some parts of the model as it evolves. Using EMF and its meta-language Ecore, the development of model management code and tools usually relies on the metamodel having some constraints, such as attribute and reference cardinalities and changeability, set in the least constrained way that any model user will require. Stronger versions of these constraints can then be enforced in code, or by attaching additional constraint expressions, and their evaluation engines, to the generated model code. We propose a mechanism that allows the constraining meta-attributes of metamodels to vary, so that different constraints can be enforced at different lifecycle stages of a model. We then discuss the implementation choices within EMF to support the validation of a state-specific metamodel on model graphs when changing states, as well as the enforcement of state-specific constraints when executing model change operations.
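To make the mechanism concrete, the following is a minimal pure-Python sketch (not EMF's API; the lifecycle states, class, feature and bounds are invented for illustration) of validating state-specific cardinality constraints against a simple model graph:

```python
# Illustrative sketch (not EMF/Ecore): per-lifecycle-state cardinality
# constraints validated against a simple model graph.

# Hypothetical lifecycle states and the multiplicities each one enforces
# for a feature, written as (lower, upper) with -1 meaning unbounded.
STATE_CONSTRAINTS = {
    "draft":    {"Task.assignees": (0, -1)},   # 0..* while editing
    "released": {"Task.assignees": (1, 3)},    # 1..3 once released
}

def validate(model, state):
    """Check every feature value in `model` against the bounds for `state`."""
    errors = []
    for feature, (lower, upper) in STATE_CONSTRAINTS[state].items():
        cls, name = feature.split(".")
        for obj in model.get(cls, []):
            n = len(obj.get(name, []))
            if n < lower or (upper != -1 and n > upper):
                errors.append(f"{cls}.{name}: {n} values, expected {lower}..{upper}")
    return errors

# A model that is valid as a draft but invalid once released:
model = {"Task": [{"assignees": []}]}
print(validate(model, "draft"))     # []
print(validate(model, "released"))  # one cardinality violation
```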
Abstract:
A study of crowds drawn to Australian football matches in colonial Victoria illuminates key aspects of the code's genesis, development and popularity. Australian football was codified by a middle-class elite that, as in Britain, created forms of mass entertainment that were consistent with the kind of industrial capitalist society they were attempting to organise. But the 'lower orders' were inculcated with traditional British folkways in matters of popular amusement, and introduced a style of 'barracking' for this new code that resisted the hegemony of the elite football administrators. By the end of the colonial period Australian football was firmly entrenched as a site of contestation between plebeian and bourgeois codes of spectating that reflected the social and ethnic diversity of the clubs making up the Victorian competition. Australian football thereby offers a classic vignette in the larger history of 'resistance through ritual'.
Abstract:
The last two decades have seen the application of six sigma methodologies in many manufacturing and also some service industries. Six sigma’s success in manufacturing is well documented, but the same cannot be said of its implementation in services, which remains limited to a small number of industries. This paper reviews the application of six sigma in service industries. Emphasis is given to application issues such as the critical success factors and key performance indicators necessary for a project to succeed. A pilot study was carried out to highlight the issues discussed. Regardless of the service provided, a number of guidelines can be applied across varying types of services. The aim of this paper is to help widen the scope of six sigma application in services.
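As an illustration of the kind of performance metric involved, the following is a minimal sketch of the standard six sigma calculations (DPMO and sigma level); the defect figures below are invented:

```python
# Standard six sigma metrics: defects per million opportunities (DPMO)
# and the conventional long-term sigma level.
from scipy.stats import norm

def dpmo(defects, units, opportunities_per_unit):
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value, shift=1.5):
    # Customary 1.5-sigma long-term shift applied to the short-term z-value.
    return norm.ppf(1 - dpmo_value / 1_000_000) + shift

# Hypothetical service example: 27 errors over 1,500 calls, 4 error
# opportunities per call.
d = dpmo(defects=27, units=1_500, opportunities_per_unit=4)
print(round(d), round(sigma_level(d), 2))  # 4500 DPMO, about 4.11 sigma
```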
Abstract:
Marketers spend considerable resources to motivate people to consume their products and services as a means of goal attainment (Bagozzi and Dholakia, 1999). Why people increase, decrease, or stop consuming some products is based largely on how well they perceive they are doing in pursuit of their goals (Carver and Scheier, 1992). Yet despite the importance for marketers of understanding how current performance influences a consumer’s future efforts, this topic has received little attention in marketing research. Goal researchers generally agree that feedback about how well or how poorly people are doing in achieving their goals affects their motivation (Bandura and Cervone, 1986; Locke and Latham, 1990). There is less agreement, however, about whether positive and negative performance feedback increases or decreases future effort (Locke and Latham, 1990). For instance, while one customer of a gym might cancel his membership after receiving negative feedback about his fitness, the same negative feedback might cause another customer to visit the gym more often to achieve better results. A similar logic can apply to many products and services, from the use of cosmetics to investing in mutual funds. The present research offers managers key insights into how to engage customers and keep them motivated. Given that connecting customers with the company is a top research priority for managers (Marketing Science Institute, 2006), this article provides suggestions for performance metrics, including four questions that managers can use to apply the findings.
Abstract:
Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
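As a simplified illustration of the idea (this sketch is invented and is not the paper's toolkit), classified data can be tracked as a per-bit taint mask, with small models of how individual C expression elements block or move it:

```python
# Bit-level taint models for two C expression elements. A set bit in the
# taint mask means that bit may carry classified data.

def taint_and(in_taint, const):
    # x & const: bits where const is 0 are forced to constant 0,
    # so taint is blocked in those positions.
    return in_taint & const

def taint_shr(in_taint, k):
    # x >> k (unsigned): taint moves with the data bits.
    return in_taint >> k

# A 32-bit secret, fully tainted:
taint = 0xFFFFFFFF
# C expression: (secret >> 24) & 0x0F  -- keeps only 4 bits of the secret,
# so the expression safely downgrades the other 28 bits.
taint = taint_and(taint_shr(taint, 24), 0x0F)
print(bin(taint))  # 0b1111: only the 4 low bits remain tainted
```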
Abstract:
The need for accessible housing in Australia is acute. Both government and the community service sector recognise the importance of well-designed accessible housing to optimise the integration of older people and people with disability, to encourage a prudent use of scarce health and community services and to enhance the liveability of our cities. In 2010, the housing industry negotiated with the Australian Government and community representatives to adopt a nationally consistent voluntary code (Livable Housing Design) and a strategy to provide a minimal level of accessibility in all new housing by 2020. Evidence from the implementation of such programs in the United Kingdom and USA, however, serves to question whether this aspirational goal can be achieved through voluntary codes. Minimal demand at the point of new sale, and problems in the production of housing to the required standards, have raised questions regarding the application of program principles in the context of a voluntary code. In addressing the latter issue, this paper presents early findings from the analysis of qualitative interviews conducted with developers, builders and designers in various housing contexts. It identifies their “logics in use” in the production of housing in response to Livable Housing Design’s voluntary code and indicates factors that are likely to assist and impede the attainment of the 2020 aspirational goal.
Abstract:
This paper focuses on information sharing with key suppliers and seeks to explore the factors that might influence its extent and depth. We also investigate how information sharing affects a company’s performance with regard to resource usage, output, and flexibility. Drawing on transaction cost and contingency theories, several factors, namely environmental uncertainty, demand uncertainty, dependency, and the product life cycle stage, are proposed to explain the level of information shared with key suppliers. We develop a model in which information sharing mediates between the (contingent) factors and company performance. A mail survey was used to collect data from Finnish and Swedish companies, and Partial Least Squares analysis was performed separately for each country (n=119, n=102). There was consistent evidence that environmental uncertainty, demand uncertainty and supplier/buyer dependency had explanatory power, whereas no significance was found for the product life cycle stage. The results also confirm previous studies by providing support for a positive relationship between information sharing and performance, with output performance found to be the most strongly related.
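As a rough, hypothetical illustration of the estimation step: the paper uses PLS path modelling, whereas the sketch below substitutes scikit-learn's PLSRegression on randomly generated data, merely to show the shape of such an analysis:

```python
# Hypothetical stand-in for a PLS analysis: four contingency factors,
# a mediating information-sharing construct, and a performance outcome.
# All data here are randomly generated, not the survey data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(119, 4))  # env. uncertainty, demand uncertainty, dependency, PLC stage
info_sharing = X @ np.array([0.5, 0.4, 0.6, 0.0]) + rng.normal(scale=0.5, size=119)
performance = 0.7 * info_sharing + rng.normal(scale=0.5, size=119)

pls = PLSRegression(n_components=2).fit(X, performance)
print(pls.score(X, performance))  # share of outcome variance explained
```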
An experimental and computational investigation of performance of Green Gully for reusing stormwater
Abstract:
A new stormwater quality improvement device (SQID) called ‘Green Gully’ has been designed and developed in this study with the aim of reusing stormwater for irrigating plants and trees. The main purpose of the Green Gully is to collect road runoff/stormwater, make it suitable for irrigation and provide an automated network system for watering roadside plants and irrigation areas. This paper presents the design and development of Green Gully along with experimental and computational investigations of its performance. Performance (in the form of efficiency, i.e. the percentage of water flow through the gully grate) was first determined experimentally using a gully model in the laboratory; a three-dimensional numerical model was then developed and simulated to predict the efficiency of Green Gully as a function of flow rate. The Computational Fluid Dynamics (CFD) code FLUENT was used for the simulation, with GAMBIT used for geometry creation and mesh generation. The predicted efficiency was compared with the laboratory-measured efficiency, and the simulated results were found to be in good agreement with the experimental results.
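A minimal sketch of the efficiency measure described above, assuming it is defined as captured flow as a percentage of approach flow (all figures below are invented):

```python
# Gully capture efficiency: percentage of the approach flow that passes
# through the gully grate.

def capture_efficiency(q_captured, q_approach):
    """Efficiency in percent; both flows in the same units (e.g. L/s)."""
    return 100.0 * q_captured / q_approach

# Hypothetical laboratory readings at increasing flow rates (L/s):
for q_in, q_grate in [(2.0, 1.9), (5.0, 4.4), (10.0, 7.8)]:
    print(f"Q = {q_in:4.1f} L/s -> efficiency {capture_efficiency(q_grate, q_in):.0f}%")
```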
Abstract:
Embedded real-time programs rely on external interrupts to respond to events in their physical environment in a timely fashion. Formal program verification theories, such as the refinement calculus, are intended for development of sequential, block-structured code and do not allow for asynchronous control constructs such as interrupt service routines. In this article we extend the refinement calculus to support formal development of interrupt-dependent programs. To do this we: use a timed semantics, to support reasoning about the occurrence of interrupts within bounded time intervals; introduce a restricted form of concurrency, to model composition of interrupt service routines with the main program they may preempt; introduce a semantics for shared variables, to model contention for variables accessed by both interrupt service routines and the main program; and use real-time scheduling theory to discharge timing requirements on interruptible program code.
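As an example of the real-time scheduling theory referred to, the following is the standard fixed-point response-time analysis for fixed-priority preemptive tasks (the task set is invented; interrupt service routines are modelled as the highest-priority tasks):

```python
# Worst-case response-time analysis: R_i = C_i + sum over higher-priority
# tasks j of ceil(R_i / T_j) * C_j, iterated to a fixed point.
import math

def response_time(C, T, i):
    """Response time of task i; tasks 0..i-1 have higher priority."""
    R = C[i]
    while True:
        R_next = C[i] + sum(math.ceil(R / T[j]) * C[j] for j in range(i))
        if R_next == R:
            return R
        if R_next > T[i]:
            return None  # deadline (taken as the period) missed
        R = R_next

C = [1, 2, 4]    # execution times: two interrupt routines, then the main task
T = [5, 10, 20]  # minimum inter-arrival times / periods
print([response_time(C, T, i) for i in range(3)])  # [1, 3, 8]
```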
Abstract:
A better understanding of the behaviour of prepared cane and bagasse, especially the ability to model the mechanical behaviour of bagasse as it is squeezed in a milling unit to extract juice, would help identify how to improve the current milling process; for example to reduce final bagasse moisture. Previous investigations have proven with certainty that juice flow through bagasse obeys Darcy’s permeability law, that the grip of the rough surface of the grooves on the bagasse can be represented by the Mohr-Coulomb failure criterion for soils, and that the internal mechanical behaviour of the bagasse can be represented by critical state behaviour similar to that of sand and clay. Current Finite Element Models (FEM) available in commercial software have adequate permeability models. However, commercial software does not contain an adequate mechanical model for bagasse. Progress has been made in the last ten years towards implementing a mechanical model for bagasse in finite element software code. This paper builds on that progress and carries out a further step towards obtaining an adequate material model. In particular, the prediction of volume change during shearing of normally consolidated final bagasse is addressed.
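For reference, the Mohr-Coulomb criterion mentioned above is tau_f = c + sigma_n * tan(phi); a minimal sketch with invented parameter values for the groove-surface grip:

```python
# Mohr-Coulomb shear strength: tau_f = c + sigma_n * tan(phi).
import math

def mohr_coulomb_shear_strength(sigma_n, cohesion, friction_angle_deg):
    """Shear strength at normal stress sigma_n (same units as cohesion)."""
    return cohesion + sigma_n * math.tan(math.radians(friction_angle_deg))

# Hypothetical parameters: c = 10 kPa, phi = 35 degrees.
for sigma_n in (50, 100, 200):  # normal stress in kPa
    tau = mohr_coulomb_shear_strength(sigma_n, cohesion=10, friction_angle_deg=35)
    print(f"sigma_n = {sigma_n:3d} kPa -> tau_f = {tau:.0f} kPa")
```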
Abstract:
This study seeks to analyse the adequacy of the current regulation of the payday lending industry in Australia, and consider whether there is a need for additional regulation to protect consumers of these services. The report examines the different regulatory approaches adopted in comparable OECD countries, and reviews alternative models for payday regulation, in particular, the role played by responsible lending. The study also examines the consumer protection mechanisms now in existence in Australia in the National Consumer Credit Protection Act 2009 (Cth) (NCCP) and the National Credit Code (NCC) contained in Schedule 1 of that Act and in the Australian Securities and Investments Commission Act 2001 (Cth).
Abstract:
Stimulated human whole saliva (WS) was used to study the dynamics of papain hydrolysis at defined pH, ionic strength and temperature, with a view to reducing an acquired pellicle. A quartz crystal microbalance with dissipation (QCM-D) was used to monitor the changes in frequency due to enzyme hydrolysis of WS films, and the hydrolytic parameters were calculated using an empirical model. The morphological and conformational changes of the salivary films before and after enzymatic hydrolysis were characterized by atomic force microscopy (AFM) imaging and grazing-angle infrared spectroscopy (GA-FTIR), respectively. The characteristics of papain hydrolysis of WS films were pH-, ionic strength- and temperature-dependent. The WS films were partially removed by the action of the enzyme, resulting in thinner and smoother surfaces. The IR data suggested that hydrolysis-induced deformation did not occur in the remnant salivary films. The process of papain hydrolysis of WS films can be controlled by properly regulating pH, ionic strength and temperature.
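QCM-D frequency shifts are commonly converted to areal mass via the Sauerbrey relation for thin rigid films; the paper's empirical model is not specified here, so the following sketch is illustrative only, with invented readings:

```python
# Sauerbrey relation: delta_m = -C * delta_f / n, where C = 17.7 ng/(cm^2*Hz)
# for a 5 MHz crystal and n is the overtone number. Valid for rigid films.

def sauerbrey_mass(delta_f_hz, overtone, C=17.7):
    """Areal mass change in ng/cm^2 from a frequency shift at a given overtone."""
    return -C * delta_f_hz / overtone

# Hypothetical example: hydrolysis raises the 3rd-overtone frequency by 12 Hz,
# i.e. mass is removed from the film.
print(sauerbrey_mass(delta_f_hz=+12, overtone=3))  # -70.8 ng/cm^2 (mass loss)
```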
Abstract:
‘Was by the Northern Coast’ was an installation at MetroArts in Brisbane. A pile of warped timber, evocative of a dismantled boat, sits in the middle of the gallery space on a bed of carefully-laid bands of polyester insulation and pine battening. From within the wood stack, the sound of dripping water indicates the flow of water created by a silent internal pump. The sound of water intermingles with a soft soundtrack of Kulning, an archaic form of Scandinavian song. In ‘Was by the Northern Coast’, the detritus of timber mimics the Romantic sublime of the mountain peak and nautical wreckage while the snowy drifts of the Northern European landscape become mistranslated as a field of artificial ceiling insulation. In employing such slippages, the work attempted to create the imaginative landscape of an aesthetic displaced by distance and time.
Abstract:
The problem of steady subcritical free surface flow past a submerged inclined step is considered. The asymptotic limit of small Froude number is treated, with particular emphasis on the effect that changing the angle of the step face has on the surface waves. As demonstrated by Chapman & Vanden-Broeck (2006), the divergence of a power series expansion in powers of the square of the Froude number is caused by singularities in the analytic continuation of the free surface; for an inclined step, these singularities may correspond to either the corners or stagnation points of the step, or both, depending on the angle of incline. Stokes lines emanate from these singularities, and exponentially small waves are switched on at the points where the Stokes lines intersect the free surface. Our results suggest that for a certain range of step angles, two wavetrains are switched on, but the exponentially subdominant one is switched on first, leading to an intermediate wavetrain not previously noted. We extend these ideas to the problem of flow over a submerged bump or trench, again with inclined sides. This time there may be two, three or four active Stokes lines, depending on the inclination angles. We demonstrate how to construct a base topography such that wave contributions from separate Stokes lines are of equal magnitude but opposite phase, thus cancelling out. Our asymptotic results are complemented by numerical solutions to the fully nonlinear equations.
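Schematically, the structure described is that of standard exponential asymptotics (a sketch, not the paper's exact formulae): a divergent base series in powers of the Froude number squared, plus exponentially small wave terms each switched on across its Stokes line:

```latex
\eta(x) \sim \sum_{n=0}^{N} \eta_n(x)\, F^{2n}
  \;+\; \sum_{k} \theta_k(x)\, A_k(x)\, \exp\!\left(-\frac{S_k(x)}{F^{2}}\right),
```

where S_k is the singulant attached to singularity k and the switching function theta_k jumps from 0 to 1 where the free surface crosses the corresponding Stokes line. Two such contributions of equal magnitude and opposite phase cancel, which is the basis of the wave-free topography construction described above.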
Abstract:
Unstructured text data, such as emails, blogs, contracts, academic publications, organizational documents, transcribed interviews, and even tweets, are important sources of data in Information Systems research. Various forms of qualitative analysis of the content of these data exist and have revealed important insights. Yet, to date, these analyses have been hampered by the limitations of human coding of large data sets, and by bias due to human interpretation. In this paper, we compare and combine two quantitative techniques to demonstrate the capabilities of computational analysis for content analysis of unstructured text. Specifically, we seek to demonstrate how two quantitative analytic methods, viz., Latent Semantic Analysis and data mining, can aid researchers in revealing core content topic areas in large (or small) data sets, and in visualizing how these concepts evolve, migrate, converge or diverge over time. We exemplify the complementary application of these techniques through an examination of a 25-year sample of abstracts from selected journals in the Information Systems, Management, and Accounting disciplines. Through this work, we explore the capabilities of two computational techniques, and show how they can be used to gather insights from a large corpus of unstructured text.
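A minimal sketch of the Latent Semantic Analysis step (an assumed workflow, not the authors' exact pipeline): TF-IDF weighting followed by truncated SVD to expose latent topic areas in a corpus:

```python
# LSA in two steps: TF-IDF term weighting, then truncated SVD to project
# documents into a low-dimensional concept space.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

abstracts = [
    "information systems adoption in organizations",
    "management accounting and organizational control",
    "text mining of organizational documents",
]  # stand-in corpus; the paper uses 25 years of journal abstracts

X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
lsa = TruncatedSVD(n_components=2, random_state=0).fit(X)
print(lsa.transform(X).shape)  # documents projected into a 2-D concept space
```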