46 results for Model driven developments


Relevance:

30.00%

Publisher:

Abstract:

The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next-generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software through the selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component-based software that is flexible, extensible, and expressive. We introduce a data-driven, object-oriented programming methodology for component-based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated in order to develop a visualization application for geospatial data. In particular, we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross-platform applicability. © The Eurographics Association 2007.
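
The composition idea lends itself to a short illustration. Below is a minimal sketch, in Python rather than the rendering stack the paper targets, of selecting and wiring component implementations from a data description; all names (REGISTRY, component, compose) are invented for illustration and are not the Fluid API.

```python
# Illustrative sketch (not the Fluid API): selecting and wiring component
# implementations from a data-driven description, in the spirit of the
# composition approach the abstract describes.
from typing import Callable, Dict

REGISTRY: Dict[str, Callable[..., object]] = {}

def component(name: str):
    """Register a component implementation under a name."""
    def register(cls):
        REGISTRY[name] = cls
        return cls
    return register

@component("renderer.wireframe")
class WireframeRenderer:
    def draw(self, scene): print(f"wireframe: {scene}")

@component("renderer.shaded")
class ShadedRenderer:
    def draw(self, scene): print(f"shaded: {scene}")

def compose(config: dict):
    """Instantiate the implementation named in the configuration data."""
    return REGISTRY[config["renderer"]]()

# The pipeline is restructured by editing data, not code:
compose({"renderer": "renderer.shaded"}).draw("terrain")
```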

Relevance:

30.00%

Publisher:

Abstract:

Team reflexivity, or the extent to which teams reflect upon and modify their functioning, has attracted much recent research attention. In the current paper, we identify several predictors as well as consequences of reflexivity by reviewing the last decade of literature on team reflexivity. It is observed that team characteristics such as trust and psychological safety among group members, a shared vision, and diversity as well as leadership style of the team’s supervisor influence the level of reflexivity. In addition, team reflexivity is related to a team’s output in terms of innovation, effectiveness, and creativity. Explanations for these effects are discussed and a model including all current findings is presented.

Relevance:

30.00%

Publisher:

Abstract:

We address the question of how to communicate among distributed processes values such as real numbers, continuous functions and geometrical solids with arbitrary precision, yet efficiently. We extend the established concept of lazy communication using streams of approximants by introducing explicit queries. We formalise this approach using protocols of a query-answer nature. Such protocols enable processes to provide valid approximations with certain accuracy and focusing on certain locality as demanded by the receiving processes through queries. A lattice-theoretic denotational semantics of channel and process behaviour is developed. The query space is modelled as a continuous lattice in which the top element denotes the query demanding all the information, whereas other elements denote queries demanding partial and/or local information. Answers are interpreted as elements of lattices constructed over suitable domains of approximations to the exact objects. An unanswered query is treated as an error and denoted using the top element. The major novel characteristic of our semantic model is that it reflects the dependency of answers on queries. This enables the definition and analysis of an appropriate concept of convergence rate, by assigning an effort indicator to each query and a measure of information content to each answer. Thus we capture not only what function a process computes, but also how a process transforms the convergence rates from its inputs to its outputs. In future work these indicators can be used to capture further computational complexity measures. A robust prototype implementation of our model is available.
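
As a concrete, if toy, illustration of the query-answer idea, the following sketch (with interfaces invented for this example, not the paper's formal protocol) shows a process that answers precision queries for the square root of 2 with intervals guaranteed to contain the exact value; coarser queries demand less effort, echoing the effort indicators described above.

```python
# A minimal sketch of the query-answer idea (hypothetical interfaces, not the
# paper's formal protocol): a receiving process demands accuracy via a query,
# and the sending process answers with an approximation valid to that accuracy.
from fractions import Fraction

def sqrt2_process(query_precision: int) -> tuple:
    """Answer a query for sqrt(2) with an interval of width <= 2**-precision."""
    lo, hi = Fraction(1), Fraction(2)
    while hi - lo > Fraction(1, 2 ** query_precision):
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return lo, hi  # a valid partial answer: the exact value lies inside

# The receiver controls effort: a coarser query costs less work.
print(sqrt2_process(4))   # interval of width <= 1/16
print(sqrt2_process(20))  # a much tighter interval, more effort
```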

Relevance:

30.00%

Publisher:

Abstract:

We develop and study the concept of dataflow process networks as used for example by Kahn to suit exact computation over data types related to real numbers, such as continuous functions and geometrical solids. Furthermore, we consider communicating these exact objects among processes using protocols of a query-answer nature as introduced in our earlier work. This enables processes to provide valid approximations with certain accuracy and focusing on certain locality as demanded by the receiving processes through queries. We define domain-theoretical denotational semantics of our networks in two ways: (1) directly, i.e. by viewing the whole network as a composite process and applying the process semantics introduced in our earlier work; and (2) compositionally, i.e. by a fixed-point construction similar to that used by Kahn from the denotational semantics of individual processes in the network. The direct semantics closely corresponds to the operational semantics of the network (i.e. it is correct) but is very difficult to study for concrete networks. The compositional semantics enables compositional analysis of concrete networks, assuming it is correct. We prove that the compositional semantics is a safe approximation of the direct semantics. We also provide a method that can be used in many cases to establish that the two semantics fully coincide, i.e. that safety is not achieved through inactivity or meaningless answers. The results are extended to cover recursively-defined infinite networks as well as nested finite networks. A robust prototype implementation of our model is available.
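
To illustrate the compositional view, here is a hedged Python toy (the names and the precision bookkeeping are assumptions, not the paper's formalism) in which a downstream process answers a query by issuing a tighter query upstream, so the composite network still delivers answers of the demanded accuracy.

```python
# Hedged sketch (hypothetical names): composing two query-driven processes
# into a network, with the downstream query propagated upstream, as a toy
# version of the compositional semantics the abstract describes.
from fractions import Fraction
from typing import Callable, Tuple

Interval = Tuple[Fraction, Fraction]
Process = Callable[[int], Interval]  # query: precision -> answer: interval

def double(upstream: Process) -> Process:
    """A process computing 2*x; to answer at precision p it queries its
    input at precision p+1, since doubling doubles the interval width."""
    def answer(p: int) -> Interval:
        lo, hi = upstream(p + 1)
        return 2 * lo, 2 * hi
    return answer

def const_third(p: int) -> Interval:
    """Source process approximating 1/3 to width <= 2**-p."""
    eps = Fraction(1, 2 ** (p + 1))
    x = Fraction(1, 3)
    return x - eps, x + eps

network = double(const_third)  # direct composition of the two processes
print(network(10))             # interval around 2/3 of width <= 2**-10
```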

Relevance:

30.00%

Publisher:

Abstract:

We have studied the kinetics of the phase-separation process of mixtures of colloid and protein in solution by real-time UV-vis spectroscopy. Complementary small-angle X-ray scattering (SAXS) was employed to determine the structures involved. The colloids used are gold nanoparticles functionalized with the protein-resistant oligo(ethylene glycol) (OEG) thiol HS(CH2)11(OCH2CH2)6OMe (EG6OMe). After mixing with protein solution above a critical concentration, c*, SAXS measurements show that a scattering maximum appears after a short induction time at q = 0.0322 Å⁻¹, which increases in intensity with time; the peak position does not change with time, protein concentration or salt addition. The peak corresponds to the nearest-neighbour distance within the aggregates. The upturn of scattering intensities in the low-q range developed with time, indicating the formation of aggregates. No Bragg peaks corresponding to the formation of colloidal crystallites could be observed before the clusters dropped out of solution. The growth kinetics of the aggregates was followed in detail by real-time UV-vis spectroscopy, using the flocculation parameter defined as the integral of the absorption over the 600-800 nm wavelength range. At low salt addition (<0.5 M), a kinetic crossover from the reaction-limited cluster aggregation (RLCA) to the diffusion-limited cluster aggregation (DLCA) growth model is observed, interpreted as being due to the effective repulsive interaction barrier between colloids within the depletion potential. Above 0.5 M NaCl, the surface charge of the proteins is screened significantly and the repulsive potential barrier disappears, so the growth kinetics can be described by a DLCA model alone.
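
As a worked illustration of the flocculation parameter, the sketch below (with a synthetic spectrum standing in for measured data) integrates absorbance over the 600-800 nm window by the trapezoidal rule.

```python
# Illustrative sketch (synthetic data): computing the flocculation parameter
# described in the abstract, the integral of UV-vis absorbance over the
# 600-800 nm window, by simple trapezoidal integration.
import numpy as np

wavelengths = np.linspace(400, 900, 501)              # nm, synthetic spectrum
absorbance = np.exp(-((wavelengths - 530) / 60) ** 2) \
             + 0.05 * (wavelengths / 900)             # plasmon peak + aggregate tail

window = (wavelengths >= 600) & (wavelengths <= 800)
flocculation_parameter = np.trapz(absorbance[window], wavelengths[window])
print(f"flocculation parameter: {flocculation_parameter:.3f}")
# Tracking this value in real time is what distinguishes the RLCA and DLCA
# growth regimes discussed in the abstract.
```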

Relevance:

30.00%

Publisher:

Abstract:

A major challenge in text mining for biomedicine is automatically extracting protein-protein interactions from the vast amount of biomedical literature. We have constructed an information extraction system based on the Hidden Vector State (HVS) model for protein-protein interactions. The HVS model can be trained using only lightly annotated data whilst simultaneously retaining sufficient ability to capture the hierarchical structure. When applied in extracting protein-protein interactions, we found that it performed better than other established statistical methods and achieved 61.5% in F-score with balanced recall and precision values. Moreover, the statistical nature of the pure data-driven HVS model makes it intrinsically robust and it can be easily adapted to other domains.
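
For reference, the reported figure follows directly from the balanced F-score formula, F1 = 2PR/(P+R); a one-line check:

```python
# With balanced precision and recall, F1 equals both, consistent with the
# reported 61.5% F-score.
def f1(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

print(f1(0.615, 0.615))  # 0.615
```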

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses three important aspects of John Sinclair’s legacy: the corpus, lexicography, and the notion of ‘corpus-driven’. The corpus represents his concern with the nature of linguistic evidence. Lexicography is for him the canonical mode of language description at the lexical level. And his belief that the corpus should ‘drive’ the description is reflected in his constant attempts to utilize the emergent computer technologies to automate the initial stages of analysis and defer the intuitive, interpretative contributions of linguists to increasingly later stages in the process. Sinclair’s model of corpus-driven lexicography has spread far beyond its initial implementation at Cobuild, to most EFL dictionaries, to native-speaker dictionaries (e.g. the New Oxford Dictionary of English, and many national language dictionaries in emerging or re-emerging speech communities) and bilingual dictionaries (e.g. Collins, Oxford-Hachette).

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses the use of a Model developed by Aston Business School to record the workload of its academic staff. By developing a database to register annual activity in all areas of teaching, administration and research, the School has created a flexible tool which can be used to facilitate both day-to-day managerial and longer-term strategic decisions. This paper gives a brief outline of the Model and discusses the factors which were taken into account when setting it up. Particular attention is paid to the uses made of the Model and the problems encountered in developing it. The paper concludes with an appraisal of the Model's impact and of additional developments which are currently being considered. Aston Business School has had a Load Model in some form for many years. The Model has, however, been refined over the past five years, so that it has developed into a form which can be used for a far greater number of purposes within the School. The Model is coordinated by a small group of academic and administrative staff, chaired by the Head of the School. This group is responsible for the annual cycle of collecting and inputting data, validating returns, carrying out analyses of the raw data, and presenting the material to different sections of the School. The authors of this paper are members of this steering group.
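
As an illustration of the kind of roll-up such a database makes routine, here is a minimal sketch; the activity categories and hours are invented, and this is not the School's actual schema.

```python
# Hypothetical sketch (invented categories and figures): aggregating one
# staff member's annual activity into per-category and total load figures.
from dataclasses import dataclass

@dataclass
class Activity:
    category: str   # "teaching", "administration" or "research"
    hours: float

def annual_load(activities: list) -> dict:
    """Sum recorded hours by category and overall."""
    totals = {}
    for a in activities:
        totals[a.category] = totals.get(a.category, 0.0) + a.hours
    totals["total"] = sum(totals.values())
    return totals

print(annual_load([Activity("teaching", 420.0),
                   Activity("administration", 160.0),
                   Activity("research", 300.0)]))
```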

Relevance:

30.00%

Publisher:

Abstract:

Using the analogy between the lateral convection of heat and two-phase flow in bubble columns, alternative turbulence modelling methods were analysed. The k-ε turbulence and Reynolds stress models were used to predict the buoyant motion of fluids where a density difference arises due to the introduction of heat or a discrete phase. A cavity with a large height-to-width aspect ratio was employed for the transport of heat, and it was shown that the Reynolds stress model, used with velocity profiles that include the laminar flow solution, resulted in the development of turbulent vortices. The turbulence models were then applied to the simulation of gas-liquid flow in a bubble column with a 5:1 height-to-width aspect ratio. For a gas superficial velocity of 0.02 m s⁻¹, it was determined that the Reynolds stress model yielded the most realistic simulation results. © 2003 Elsevier B.V. All rights reserved.
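
For readers unfamiliar with the closure being compared against, the standard k-ε model computes the turbulent (eddy) viscosity from the two transported quantities; this is the textbook relation, not something specific to this paper:

```latex
% Standard k-epsilon eddy-viscosity relation (textbook form): turbulent
% viscosity from the turbulent kinetic energy k and its dissipation rate
% epsilon, with the usual model constant C_mu of about 0.09.
\[
  \nu_t = C_\mu \,\frac{k^2}{\varepsilon},
  \qquad C_\mu \approx 0.09 .
\]
```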

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this research is to develop a holistic approach that maximizes the customer service level while minimizing logistics cost, using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, this paper proposes an integrated approach which considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments.

Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, based on the existing literature and validated with an example case. The model integrates a fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem.

Findings – This paper provides several novel insights into how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain maintains and increases its competitiveness in two ways: optimizing cost and providing the best service simultaneously.

Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Given the generality of the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further enhance the validity of the research output.

Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, and for the construction and management of an optimal transshipment network.

Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors under a fuzzy environment are considered simultaneously, and that the viewpoints of both service deliverers and customers are taken into account. It is therefore believed to be useful and applicable for transshipment service network design.
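
A toy illustration of weighting quantitative cost against a qualitative service criterion follows; all weights and scores below are invented, and in a real application the criterion weights would come from the FAHP step rather than being assumed.

```python
# Illustrative sketch only (invented weights and scores): ranking candidate
# transshipment routes by a weighted sum of normalised cost and a qualitative
# service score, the weights standing in for FAHP-derived priorities.
routes = {
    # route: (cost per unit, service score in [0, 1] from expert judgement)
    "via_hub_A": (12.0, 0.90),
    "via_hub_B": (9.5, 0.70),
    "direct":    (14.0, 0.95),
}

w_cost, w_service = 0.55, 0.45          # assumed criterion weights
max_cost = max(c for c, _ in routes.values())

def utility(cost: float, service: float) -> float:
    # Normalise cost so that cheaper is better, then take the weighted sum.
    return w_cost * (1 - cost / max_cost) + w_service * service

best = max(routes, key=lambda r: utility(*routes[r]))
print(best, {r: round(utility(*v), 3) for r, v in routes.items()})
```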

Relevance:

30.00%

Publisher:

Abstract:

The Securities and Exchange Commission (SEC) in the United States, and in particular its immediate past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system of disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October this year the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of researching non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and its chairman in particular, over a period of about three years, both before and after the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties, and the statements made by some of the policy makers and regulators central to the programme to advance the socio-technical project that is constituted by Interactive Data. We adopt insights from ANT, and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005), to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others, and in so doing reveal the regulatory bodies taking the regulated for granted. The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p. 16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information,..." (p. 17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we wish to consider the rhetorical and discursive moves in which the SEC and others have engaged to present such developments as providing clearer reporting and accountability standards and serving the interests of this constructed and largely unknown group - the non-professional investor.

Relevance:

30.00%

Publisher:

Abstract:

This research is focused on the optimisation of resource utilisation in wireless mobile networks, with consideration of the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated. These include video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment were used in the validation tests. It has been shown that Pause Intensity is closely correlated with the subjective quality measurement in terms of the Mean Opinion Score, and that this correlation property is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach concerns both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with the consideration of the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment in which the proposed research framework is examined and the results are compared with existing scheduling methods on the achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users by considering their perceived quality for the services received. Meanwhile, a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, have been demonstrated in the context of LTE. Finally, work on the interworking between the mobile communication system at the macro-cell level and the different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism for the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fair-efficient spectrum. The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and of each of the WiFi access points involved. The performance of non-seamless, user-controlled mobile traffic offloading (through mobile WiFi devices) has been evaluated and compared with that of standard operator-controlled WiFi hotspots.
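
As a rough illustration of the metric's spirit only (the formula below is a simplified proxy, not the thesis's exact definition of Pause Intensity), a score that grows with both pause duration and pause frequency can drive a QoE-aware priority ordering:

```python
# Hedged sketch: a pause-intensity-style impairment score combining how often
# and how long playback stalls, then a QoE-driven priority ordering that a
# scheduler could use to favour the worst-affected users.
from dataclasses import dataclass

@dataclass
class Playback:
    user: str
    pauses: list          # pause durations in seconds
    session_s: float      # total session length in seconds

def pause_intensity_proxy(p: Playback) -> float:
    """Combine pause frequency and pause duration into one impairment score."""
    if not p.pauses:
        return 0.0
    duration_ratio = sum(p.pauses) / p.session_s   # fraction of time stalled
    frequency = len(p.pauses) / p.session_s        # stalls per second
    return duration_ratio * frequency

sessions = [
    Playback("u1", [2.0, 1.5, 3.0], 120.0),
    Playback("u2", [0.5], 120.0),
    Playback("u3", [], 120.0),
]

# Prioritise the most impaired users first.
for p in sorted(sessions, key=pause_intensity_proxy, reverse=True):
    print(p.user, round(pause_intensity_proxy(p), 5))
```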

Relevance:

30.00%

Publisher:

Abstract:

This review incorporates strategic planning research conducted over more than 30 years and ranges from the classical model of strategic planning to recent empirical work on intermediate outcomes, such as the reduction of managers’ position bias and the coordination of subunit activity. Prior reviews have not had the benefit of more socialized perspectives that developed in response to Mintzberg’s critique of planning, including research on planned emergence and strategy-as-practice approaches. To stimulate a resurgence of research interest on strategic planning, this review therefore draws on a diverse body of theory beyond the rational design and contingency approaches that characterized research in this domain until the mid-1990s. We develop a broad conceptualization of strategic planning and identify future research opportunities for improving our understanding of how strategic planning influences organizational outcomes. Our framework incorporates the role of strategic planning practitioners; the underlying routines, norms, and procedures of strategic planning (practices); and the concrete activities of planners (praxis).

Relevance:

30.00%

Publisher:

Abstract:

In recent decades, a number of sustainable strategies and policies have been created to protect and preserve our water environments from the impacts of growing communities. The Australian approach, Water Sensitive Urban Design (WSUD), defined as the integration of urban planning and design with urban water cycle management, has made considerable advances in design guidelines since 2000. WSUD stormwater management systems (e.g. wetlands, bioretention systems, porous pavement, etc.), also known as Best Management Practices (BMPs) or Low Impact Development (LID), are slowly gaining popularity across Australia, the USA and Europe. There have also been significant improvements in modelling the performance of WSUD technologies (e.g. the MUSIC software). However, the implementation issues of these WSUD practices are mainly related to ongoing institutional capacity. Some of the key problems are associated with the limited awareness of urban planners and designers; in general, they have very little knowledge of these systems and their benefits to urban environments. At the same time, hydrological engineers should have a better understanding of building codes and master plans. Land use regulations are just as important as the physical site conditions in determining opportunities and constraints for implementing WSUD techniques. There is a need for procedures that create a better linkage between urban planners and WSUD engineering practices. Thus, this paper aims to present the development of a general framework for incorporating WSUD technologies into the site planning process. The study was applied at the lot scale in the Melbourne region, Australia. Results show the potential space available for fitting WSUD elements, according to building requirements and different types of housing densities. © 2011 WIT Press.
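
A back-of-the-envelope sketch of the lot-scale space estimate follows; the site coverage and paving fractions are invented for illustration and do not come from the paper.

```python
# Illustrative arithmetic only (invented figures): estimating the open lot
# area potentially available for WSUD elements once the building footprint
# and paved surfaces are accounted for.
def wsud_space(lot_area: float, coverage: float, paved_fraction: float) -> float:
    """Open area (m^2) potentially available for WSUD elements."""
    open_area = lot_area * (1.0 - coverage)      # not under the building
    return open_area * (1.0 - paved_fraction)    # not paved driveway/paths

# Hypothetical lots at two housing densities:
print(wsud_space(450.0, coverage=0.40, paved_fraction=0.25))  # detached lot
print(wsud_space(200.0, coverage=0.60, paved_fraction=0.30))  # townhouse lot
```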

Relevance:

30.00%

Publisher:

Abstract:

Animal models of acquired epilepsies aim to provide researchers with tools for use in understanding the processes underlying the acquisition, development and establishment of the disorder. Typically, following a systemic or local insult, vulnerable brain regions undergo a process leading to the development, over time, of spontaneous recurrent seizures. Many such models make use of a period of intense seizure activity or status epilepticus, and this may be associated with high mortality and/or global damage to large areas of the brain. These undesirable elements have driven improvements in the design of chronic epilepsy models, for example the lithium-pilocarpine epileptogenesis model. Here, we present an optimised model of chronic epilepsy that reduces mortality to 1% whilst retaining features of high epileptogenicity and development of spontaneous seizures. Using local field potential recordings from hippocampus in vitro as a probe, we show that the model does not result in significant loss of neuronal network function in area CA3 and, instead, subtle alterations in network dynamics appear during a process of epileptogenesis, which eventually leads to a chronic seizure state. The model’s features of very low mortality and high morbidity in the absence of global neuronal damage offer the chance to explore the processes underlying epileptogenesis in detail, in a population of animals not defined by their resistance to seizures, whilst acknowledging and being driven by the 3Rs (Replacement, Refinement and Reduction of animal use in scientific procedures) principles.