261 results for Packing dimension


Relevance: 10.00%

Abstract:

In cases involving allegations of price fixing under the former s 45A of the Trade Practices Act 1974 (Cth), it was necessary to prove that at least two parties to the arrangement or understanding at issue were “in competition with each other”. The same requirement is contained in the cartel provisions of the Competition and Consumer Act 2010 (Cth) (CCA) that replaced s 45A. The so-called “competition condition” is set out in s 44ZZRD(4) of the CCA. Where a supplier enters into vertical supply arrangements with agents or brokers, problems can arise if the supplier also has a downstream presence. At that functional level there may be a horizontal and therefore competitive dimension, and the competition condition may be satisfied. In such circumstances, great care will need to be taken in any discussions between the supplier and its downstream agents or distributors about the prices, discounts, allowances, rebates or credits that the agent or distributor may charge. Whether agents or brokers competed with their suppliers in vertical supply arrangements arose for consideration in two decisions handed down by the Federal Court in Brisbane...

Relevance: 10.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines, in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement that puts a fixed number of mappers/reducers on each machine. The comparison results show that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement while still satisfying the computation deadline.
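To make the bin-packing analogy concrete, the sketch below shows a generic first-fit-decreasing placement of tasks on machines. It is an illustrative baseline only, not the heuristic proposed in the paper, and the task demands and machine capacity are hypothetical resource figures (e.g. memory or vCPU slots).

```python
# Illustrative first-fit-decreasing placement: machines are bins, tasks are items.
# Demands and capacity below are made-up numbers, not values from the paper.
from typing import List

def place_tasks(task_demands: List[float], machine_capacity: float) -> List[List[int]]:
    """Return a list of machines, each a list of task indices assigned to it."""
    machines: List[List[int]] = []
    free: List[float] = []
    # sort tasks by decreasing demand, then place each in the first machine that fits
    for idx in sorted(range(len(task_demands)), key=lambda i: -task_demands[i]):
        demand = task_demands[idx]
        for m, cap in enumerate(free):
            if demand <= cap:
                machines[m].append(idx)
                free[m] -= demand
                break
        else:
            machines.append([idx])                    # provision a new machine
            free.append(machine_capacity - demand)
    return machines

# usage: six mappers and two reducers with assumed demands, machines of capacity 8
demands = [3.0, 2.5, 2.0, 2.0, 1.5, 1.0, 4.0, 4.0]
print(place_tasks(demands, machine_capacity=8.0))
```

First-fit decreasing opens a new machine only when no existing machine can accommodate a task, which is the same trade-off between machine cost and capacity that the placement problem formalises.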

Relevance: 10.00%

Abstract:

The objective of this paper is to explore the relationship between dynamic capabilities and different types of online innovations. Building on qualitative data from the publishing industry, our analysis revealed that companies with relatively strong dynamic capabilities in all three areas (sensing, seizing and reconfiguration) seem to produce innovations that combine their existing capabilities on either the market or the technology dimension with new capabilities on the other dimension, resulting in niche-creation and revolutionary-type innovations. Correspondingly, companies with a weaker or more one-sided set of dynamic capabilities seem to produce more radical innovations requiring both new market and new technological capabilities. The study therefore provides an empirical contribution to the emerging work on dynamic capabilities through its in-depth investigation of the capabilities of the four case firms, and by mapping the patterns between each firm's portfolio of dynamic capabilities and its innovation outcomes.

Relevance: 10.00%

Abstract:

My thesis is an exploration of the architectural production surrounding the French philosopher Gilles Deleuze, specifically, through the overarching theme of Deleuze’s theory of subjectivity, which I will call subjectivization. I interpret this to mean the strange coalescence of matter, architectural subject, and event, in architectural experience and culture. I speculate that subjectivization presents a yet under-explored dimension of deleuzianism in architecture. In order to develop this I pursue two independent trajectories: firstly the narrative of architectural production surrounding Deleuze, from the 1970s until today, as it is an emergence of changing groupings, alliances, formations and disbandment in the pursuit of creative-intellectual tasks—what might be called the subjectivization of architecture—and, secondly, through a speculation about the architecture of subjectivization—that is, an attempt to explore, concretely, what might be the space and time of subjectivization. Chapter One traces an oral history of deleuzianism in architecture, through conversations with Sanford Kwinter and John Rajchman, describing how the Deleuze milieu makes its way into architectural practice and discussion—subjectivization as a social and cultural emergence—whereas Chapter Two theorizes the emergence of an architectural subjectivity where architecture constitutes its own affective event—what I call subjectivization or material becoming-subject.

Relevance: 10.00%

Abstract:

The numerical solution in one space dimension of advection-reaction-diffusion systems with nonlinear source terms may incur a high computational cost when the presently available methods are used. Numerous examples of finite volume schemes with high-order spatial discretisations, together with various techniques for the approximation of the advection term, can be found in the literature. Almost all such techniques result in a nonlinear system of equations as a consequence of the finite volume discretisation, especially when there are nonlinear source terms in the associated partial differential equation models. This work introduces a new technique that avoids generating such nonlinear systems of equations through the spatial discretisation process when the nonlinear source terms in the model equations can be expanded in positive powers of the dependent variable of interest. The basis of this method is a new linearisation technique for the temporal integration of the nonlinear source terms that supplements a more typical finite volume method. The resulting linear system of equations is shown to be both accurate and significantly faster to solve than methods that necessitate the use of solvers for nonlinear systems of equations.
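As a point of reference only (not the linearisation proposed in the paper), the sketch below advances a 1D reaction-diffusion problem, u_t = D u_xx + r u(1 - u), with an implicit step in which the quadratic part of the source is linearised by freezing one factor at the old time level, so each step needs a single linear solve rather than a Newton iteration. The grid, coefficients and zero-flux boundary treatment are assumed for illustration.

```python
# Minimal sketch: implicit step for u_t = D u_xx + r u (1 - u) on a uniform grid.
# The source is taken implicitly with the quadratic term linearised as
# u_old * u_new, so the update reduces to one linear (tridiagonal) solve.
import numpy as np

def linearised_step(u, dx, dt, D, r):
    n = u.size
    lam = D * dt / dx**2
    A = np.zeros((n, n))
    for i in range(n):
        # diagonal: identity + implicit diffusion + linearised source r*(1 - u_old)
        A[i, i] = 1.0 + 2.0 * lam - dt * r * (1.0 - u[i])
        if i > 0:
            A[i, i - 1] = -lam
        if i < n - 1:
            A[i, i + 1] = -lam
    A[0, 0] -= lam        # zero-flux (reflecting) left boundary
    A[-1, -1] -= lam      # zero-flux right boundary
    return np.linalg.solve(A, u)

# usage: evolve a Gaussian bump under Fisher-type kinetics
x = np.linspace(0.0, 1.0, 51)
u = np.exp(-50.0 * (x - 0.5) ** 2)
for _ in range(200):
    u = linearised_step(u, dx=x[1] - x[0], dt=1e-3, D=0.01, r=1.0)
print(u.min(), u.max())
```

The same trick applies to any source expressible in powers of u: taking exactly one factor of each power at the new time level keeps the update linear in the unknowns.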

Relevance: 10.00%

Abstract:

Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant techniques for developing emulators have been priors in the form of Gaussian stochastic processes (GASP) that are conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there is a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept for developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, because it incorporates our knowledge of the dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by application to a simple hydrological model.
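For context on the GASP baseline that this work departs from, the sketch below conditions a Gaussian-process prior on a small design data set and interpolates a stand-in simulator. The kernel, length-scale and jitter are illustrative assumptions, and the state-space/Kalman-smoothing construction of the paper is not reproduced here.

```python
# Minimal GP emulator sketch: interpolate a deterministic simulator between
# design points. Kernel and hyperparameters are assumed, not fitted.
import numpy as np

def sq_exp_kernel(a, b, length_scale=0.3, variance=1.0):
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_emulator(x_design, y_design, x_new, jitter=1e-8):
    """Return posterior mean and standard deviation at x_new, given design runs."""
    K = sq_exp_kernel(x_design, x_design) + jitter * np.eye(x_design.size)
    K_star = sq_exp_kernel(x_new, x_design)
    alpha = np.linalg.solve(K, y_design)
    mean = K_star @ alpha
    cov = sq_exp_kernel(x_new, x_new) - K_star @ np.linalg.solve(K, K_star.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# usage: emulate an "expensive" simulator from a handful of design runs
simulator = lambda x: np.sin(3 * x) + 0.5 * x          # stand-in model
x_design = np.linspace(0.0, 1.0, 8)
y_design = simulator(x_design)
x_new = np.linspace(0.0, 1.0, 100)
mean, std = gp_emulator(x_design, y_design, x_new)
```

With many closely spaced design points the matrix K becomes nearly singular, which is the numerical difficulty noted in point (ii) above.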

Relevance: 10.00%

Abstract:

Objectives: To establish injury rates among a population of elite athletes, to provide normative data for psychological variables hypothesised to be predictive of sport injuries, and to establish relations between measures of mood, perceived life stress, and injury characteristics as a precursor to introducing a psychological intervention to ameliorate the injury problem. Methods: As part of annual screening procedures, athletes at the Queensland Academy of Sport report medical and psychological status. Data from 845 screenings (433 female and 412 male athletes) were reviewed. Population specific tables of normative data were established for the Brunel mood scale and the perceived stress scale. Results: About 67% of athletes were injured each year, and about 18% were injured at the time of screening. Fifty percent of variance in stress scores could be predicted from mood scores, especially for vigour, depression, and tension. Mood and stress scores collectively had significant utility in predicting injury characteristics. Injury status (current, healed, no injury) was correctly classified with 39% accuracy, and back pain with 48% accuracy. Among a subset of 233 uninjured athletes (116 female and 117 male), five mood dimensions (anger, confusion, fatigue, tension, depression) were significantly related to orthopaedic incidents over the preceding 12 months, with each mood dimension explaining 6–7% of the variance. No sex differences in these relations were found. Conclusions: The findings support suggestions that psychological measures have utility in predicting athletic injury, although the relatively modest explained variance highlights the need to also include underlying physiological indicators of allostatic load, such as stress hormones, in predictive models.

Relevance: 10.00%

Abstract:

An important responsibility of the Environment Protection Authority, Victoria, is to set objectives for levels of environmental contaminants. To support the development of environmental objectives for water quality, a need has been identified to understand the dual impacts of the concentration and the duration of a contaminant on biota in freshwater streams. For suspended solids contamination, information reported in the Newcombe and Jensen (1996) study of freshwater fish and the daily suspended solids data from the United States Geological Survey stream monitoring network are utilised. The study group was asked to examine the utility of both the Newcombe and Jensen results and the US data, as well as the formulation of a procedure for use by the Environment Protection Authority, Victoria, that takes the concentration and duration of harmful episodes into account when assessing water quality. The extent to which the impact of a toxic event on fish health could be modelled deterministically was also considered. Concentration and exposure duration were found to be the main factors governing the severity of the effects of suspended solids on freshwater fish. A protocol for assessing the cumulative effect on fish health and a simple deterministic model, based on the biology of gill harm and recovery, were proposed (a sketch of a concentration-duration severity score is given after the references below).

References:
D. W. T. Au, C. A. Pollino, R. S. S. Wu, P. K. S. Shin, S. T. F. Lau, and J. Y. M. Tang. Chronic effects of suspended solids on gill structure, osmoregulation, growth, and triiodothyronine in juvenile green grouper Epinephelus coioides. Marine Ecology Progress Series, 266:255-264, 2004.
J. C. Bezdek, S. K. Chuah, and D. Leep. Generalized k-nearest neighbor rules. Fuzzy Sets and Systems, 18:237-26, 1986.
E. T. Champagne, K. L. Bett-Garber, A. M. McClung, and C. Bergman. Sensory characteristics of diverse rice cultivars as influenced by genetic and environmental factors. Cereal Chemistry, 81:237-243, 2004.
S. G. Cheung and P. K. S. Shin. Size effects of suspended particles on gill damage in green-lipped mussel Perna viridis. Marine Pollution Bulletin, 51(8-12):801-810, 2005.
D. H. Evans. The fish gill: site of action and model for toxic effects of environmental pollutants. Environmental Health Perspectives, 71:44-58, 1987.
G. C. Grigg. The failure of oxygen transport in a fish at low levels of ambient oxygen. Comp. Biochem. Physiol., 29:1253-1257, 1969.
G. Holmes, A. Donkin, and I. H. Witten. Weka: A machine learning workbench. In Proceedings of the Second Australia and New Zealand Conference on Intelligent Information Systems, volume 24, pages 357-361, Brisbane, Australia, 1994. IEEE Computer Society.
D. D. Macdonald and C. P. Newcombe. Utility of the stress index for predicting suspended sediment effects: response to comments. North American Journal of Fisheries Management, 13:873-876, 1993.
C. P. Newcombe. Suspended sediment in aquatic ecosystems: ill effects as a function of concentration and duration of exposure. Technical report, British Columbia Ministry of Environment, Lands and Parks, Habitat Protection Branch, Victoria, 1994.
C. P. Newcombe and J. O. T. Jensen. Channel suspended sediment and fisheries: A synthesis for quantitative assessment of risk and impact. North American Journal of Fisheries Management, 16(4):693-727, 1996.
C. P. Newcombe and D. D. Macdonald. Effects of suspended sediments on aquatic ecosystems. North American Journal of Fisheries Management, 11(1):72-82, 1991.
K. Schmidt-Nielsen. Scaling: Why is Animal Size so Important? Cambridge University Press, New York, 1984.
J. S. Schwartz, A. Simon, and L. Klimetz. Use of fish functional traits to associate in-stream suspended sediment transport metrics with biological impairment. Environmental Monitoring and Assessment, 179(1-4):347-369, 2011.
E. Al Shaw and J. S. Richardson. Direct and indirect effects of sediment pulse duration on stream invertebrate assemblages and rainbow trout (Oncorhynchus mykiss) growth and survival. Canadian Journal of Fisheries and Aquatic Sciences, 58:2213-2221, 2001.
P. Tiwari and H. Hasegawa. Demand for housing in Tokyo: A discrete choice analysis. Regional Studies, 38:27-42, 2004.
Y. Tramblay, A. Saint-Hilaire, T. B. M. J. Ouarda, F. Moatar, and B. Hecht. Estimation of local extreme suspended sediment concentrations in California rivers. Science of the Total Environment, 408:4221--
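The sketch below illustrates, with placeholder coefficients, the general form of the concentration-duration severity score tabulated by Newcombe and Jensen (1996): severity increases with the logarithm of both the suspended solids concentration and the exposure duration. It is not the protocol or the deterministic gill-harm model proposed by the study group.

```python
# Illustrative only: a dose-response "severity of ill effect" score of the
# general log-concentration / log-duration form used by Newcombe and Jensen
# (1996). The coefficients a, b, c are placeholders, not fitted values.
import math

def severity_index(concentration_mg_per_l, duration_hours, a=1.0, b=0.6, c=0.7):
    """Severity rises with the log of exposure duration and of concentration."""
    return a + b * math.log(duration_hours) + c * math.log(concentration_mg_per_l)

# usage: a long, moderately turbid episode vs. a short, highly turbid one
print(severity_index(concentration_mg_per_l=80.0, duration_hours=96.0))
print(severity_index(concentration_mg_per_l=500.0, duration_hours=6.0))
```

The log-log structure captures the study's central point: a low concentration sustained for a long time can be as harmful as a brief, highly turbid event.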

Relevance: 10.00%

Abstract:

The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics. Also, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement. The comparison results show that the computation using our mapper/reducer placement is much cheaper while still satisfying the computation deadline.

Relevance: 10.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines, in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new grouping genetic algorithm for the mappers/reducers placement problem in cloud computing. Compared with the original grouping genetic algorithm, ours uses an innovative coding scheme and also eliminates the inversion operator, which is an essential operator in the original algorithm. The new grouping genetic algorithm is evaluated experimentally, and the results show that it is much more efficient than four popular algorithms for the problem, including the original grouping genetic algorithm.
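For readers unfamiliar with grouping genetic algorithms, the sketch below shows the classic group-based representation and a Falkenauer-style group crossover with first-fit repair for a bin-packing-like placement problem. It is background only: the paper's innovative coding scheme (and its removal of the inversion operator) is not reproduced here, and the item sizes and capacity are hypothetical.

```python
# Background sketch of the classic grouping-GA representation (Falkenauer style).
# A chromosome is a list of groups (machines/bins), each holding task indices;
# crossover injects whole groups and repairs clashes with first-fit reinsertion.
import random

def first_fit(order, sizes, capacity):
    """Pack items (taken in the given index order) into bins; return groups of indices."""
    groups, loads = [], []
    for idx in order:
        for b, load in enumerate(loads):
            if load + sizes[idx] <= capacity:
                groups[b].append(idx)
                loads[b] += sizes[idx]
                break
        else:
            groups.append([idx])
            loads.append(sizes[idx])
    return groups

def group_crossover(parent_a, parent_b, sizes, capacity):
    """Inject half of parent_b's groups into parent_a, drop clashing groups, repair."""
    inherited = random.sample(parent_b, k=max(1, len(parent_b) // 2))
    used = {i for grp in inherited for i in grp}
    kept = [grp for grp in parent_a if not used.intersection(grp)]
    child = [list(grp) for grp in inherited + kept]
    placed = {i for grp in child for i in grp}
    loads = [sum(sizes[i] for i in grp) for grp in child]
    for i in range(len(sizes)):                     # first-fit repair of displaced items
        if i in placed:
            continue
        for b, load in enumerate(loads):
            if load + sizes[i] <= capacity:
                child[b].append(i)
                loads[b] += sizes[i]
                break
        else:
            child.append([i])
            loads.append(sizes[i])
    return child

# usage: build two parents with different packing orders, then cross them
sizes = [random.randint(1, 10) for _ in range(20)]
parent_a = first_fit(range(len(sizes)), sizes, capacity=15)
parent_b = first_fit(sorted(range(len(sizes)), key=lambda i: -sizes[i]), sizes, capacity=15)
child = group_crossover(parent_a, parent_b, sizes, capacity=15)
print(len(parent_a), len(parent_b), len(child), "groups")
```

In a full GA the child would then be scored (for example by the number of machines used) and selection applied; the abstract's contribution lies in replacing this classic encoding and its operators.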

Relevance: 10.00%

Abstract:

The design-build (DB) system is regarded as an effective means of delivering sustainable buildings. Specifying clear sustainability requirements to potential contractors is of great importance to project success. This research investigates the current state of practice for the definition of sustainability requirements within the public sectors of the U.S. construction market using a robust content analysis of 49 DB requests for proposals (RFPs). The results reveal that owners predominantly communicate their desired level of sustainability through the LEED certification system. The sustainability requirement has become an important dimension in the best-value evaluation of DB contractors, with importance weightings of up to 25%. Additionally, owners of larger projects, and owners who provide less design information in their RFPs, generally allocate significantly higher importance weightings to sustainability requirements. The primary contribution of this study to the construction industry is that it reveals current trends in DB procurement for green projects. The findings also provide owners, architects, engineers, and constructors with an effective means of communicating sustainability objectives in solicitation documents.

Relevance: 10.00%

Abstract:

The processing of juice expressed from whole green sugarcane crop (stalk and trash) leads to poor clarification performance, reduced sugar yield and poor raw sugar quality. The cause of these adverse effects is linked to the disproportionate contribution of impurities from the trash component of the crop. This paper reports on the zeta (ζ) potential, average size distribution (d50) and fractal dimension (Df) of limed juice particles derived from various juice types using laser diffraction and dynamic light scattering techniques. The influence of non-sucrose impurities on the interactive energy contributions between sugarcane juice particles was examined on the basis of Derjaguin-Landau-Verwey-Overbeek (DLVO) theory. Results from these investigations have provided evidence (in terms of particle stability) on why juice particles derived from whole green sugarcane crop are relatively difficult to coagulate (and flocculate). The presence of trash reduces the van der Waals forces of attraction between particles, thereby reducing coagulation and flocculation processes. It is anticipated that further fundamental work will lead to strategies that could be adopted for clarifying juices expressed from whole green sugarcane crop.
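As background on the DLVO reasoning invoked above (and not a reproduction of the study's measurements or parameters), the sketch below evaluates the standard close-approach expressions for the pair interaction energy of two equal spheres: an unretarded van der Waals attraction plus a constant-potential electrical double-layer repulsion. The Hamaker constant, particle radius, surface potential and Debye length are assumed round numbers.

```python
# Illustrative DLVO pair interaction for two equal spheres (Derjaguin,
# close-approach forms). All material parameters below are assumed values.
import numpy as np

EPS0 = 8.854e-12          # vacuum permittivity, F/m
K_B = 1.381e-23           # Boltzmann constant, J/K

def dlvo_energy(h, radius, hamaker, psi0, debye_length, eps_r=80.0, temp=298.15):
    """Return (V_vdw, V_edl, V_total) in units of k_B*T at separation h (m)."""
    v_vdw = -hamaker * radius / (12.0 * h)                       # attraction
    v_edl = (2.0 * np.pi * eps_r * EPS0 * radius * psi0**2
             * np.log1p(np.exp(-h / debye_length)))              # repulsion
    kt = K_B * temp
    return v_vdw / kt, v_edl / kt, (v_vdw + v_edl) / kt

# usage: scan separations from 0.5 to 30 nm for 1 micron particles
h = np.linspace(0.5e-9, 30e-9, 200)
_, _, v_total = dlvo_energy(h, radius=0.5e-6, hamaker=1e-20,
                            psi0=-0.020, debye_length=3e-9)
print("barrier height ~ %.1f kT" % v_total.max())
```

In this picture, a weaker van der Waals term shrinks the attractive contribution relative to the repulsive barrier, which is consistent with the abstract's observation that trash-derived impurities hinder coagulation and flocculation.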

Relevance: 10.00%

Abstract:

The processing of juice expressed from whole green sugarcane crop (stalk and trash) leads to poor clarification performance, reduced sugar yield and poor raw sugar quality. The cause of these adverse effects is linked to the disproportionate contribution of impurities from the trash component of the crop. This paper reports on the zeta (ζ) potential, average size distribution (d50) and fractal dimension (Df) of limed juice particles derived from various juice types using laser diffraction and dynamic light scattering techniques. The influence of non-sucrose impurities on the interactive energy contributions between sugarcane juice particles was examined on the basis of Derjaguin-Landau-Verwey-Overbeek (DLVO) theory. Results from these investigations have provided evidence (in terms of particle stability) on why juice particles derived from whole green sugarcane crop are relatively difficult to coagulate (and flocculate). The presence of trash reduces the van der Waals forces of attraction between particles, thereby reducing coagulation and flocculation processes. It is anticipated that further fundamental work will lead to strategies that could be adopted for clarifying juices expressed from whole green sugarcane crop.

Relevance: 10.00%

Abstract:

In this paper, we consider the problem of position regulation for a class of underactuated rigid-body vehicles that operate within a gravitational field and have fully actuated attitude. The control objective is to regulate the vehicle position to a manifold of dimension equal to the underactuation degree. We address the problem using Port-Hamiltonian theory, and reduce the associated matching PDEs to a set of algebraic equations using a kinematic identity. The resulting method for control design is constructive. The point within the manifold to which the position is regulated is determined by the action of the potential field and the geometry of the manifold. We illustrate the performance of the controller for an unmanned aerial vehicle with underactuation degree two, namely a quadrotor helicopter.
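As a loose, fully actuated analogue of the regulate-to-a-manifold objective (and not the paper's underactuated matching-PDE design), the sketch below shapes the potential energy of a point mass so that only its distance to a target plane is penalised, cancels gravity, and injects damping. Where the mass settles within the plane is not prescribed by the controller, which is the sense in which position is regulated to a manifold rather than to a point. The model, the plane and all gains are assumed.

```python
# Toy energy-shaping + damping-injection controller for a fully actuated point
# mass, regulating position to the plane n.q = c. Not the paper's design.
import numpy as np

m, g = 1.0, 9.81                      # mass and gravity
n = np.array([0.0, 0.0, 1.0])         # unit normal of the target plane n.q = c
c, k, d = 2.0, 4.0, 3.0               # plane offset, potential gain, damping gain

def control(q, p):
    """Cancel gravity, add the shaped potential 0.5*k*(n.q - c)^2, inject damping."""
    gravity_comp = np.array([0.0, 0.0, m * g])
    shaping = -k * n * (n @ q - c)
    damping = -(d / m) * p
    return gravity_comp + shaping + damping

# simulate with a small explicit Euler step
q = np.array([1.0, -1.0, 0.0])
p = np.zeros(3)
dt = 1e-3
for _ in range(20000):
    u = control(q, p)
    p = p + dt * (np.array([0.0, 0.0, -m * g]) + u)   # momentum dynamics
    q = q + dt * (p / m)                              # position dynamics
print("distance to plane:", n @ q - c)                # approaches 0
print("final position within the plane:", q[:2])      # set by the initial data
```

The closed-loop energy (kinetic plus shaped potential) decreases under the injected damping, so the mass converges to the plane with zero momentum.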

Relevance: 10.00%

Abstract:

The present study investigated the impact of teachers' organizational citizenship behaviours (OCBs) on student quality of school life (SQSL) via the indirect effect of job efficacy. A measure of teacher OCBs was developed, tapping one dimension of individual-focused OCB (OCBI – student-directed behaviour) and two dimensions of organization-focused OCB (OCBO – civic virtue and professional development). In line with previous research suggesting that OCBs may enhance job efficacy, as well as studies demonstrating the positive effects of teacher efficacy on student outcomes, we expected an indirect relationship between teachers' OCBs and SQSL via teachers' job efficacy. Hypotheses were tested in a multi-level design in which 170 teachers and their students (N=3,057) completed questionnaires. A significant proportion of variance in SQSL was attributable to classroom factors. Analyses revealed that the civic virtue and professional development behaviours of teachers were positively related to their job efficacy. The job efficacy of teachers also had a positive impact on all five indicators of SQSL. With regard to professional development, job efficacy acted as a mediating variable in the prediction of four student outcomes (i.e., general satisfaction, student–teacher relations, achievement, and opportunity) and fully mediated the negative effect on psychological distress.