943 results for Point Process
Abstract:
This research focuses on the impact of online collaboration and networking among consumers on social media (SM) websites that feature user-generated content in the form of product reviews, ratings and recommendations (PRRR), an emerging information source. The proliferation of websites where consumers can post PRRR and share them with other consumers has altered the marketing environment in which companies, marketers and advertisers operate. This cross-sectional study explored consumers’ attitudes and behaviour toward various information sources (IS) used in the information search phase of the purchasing decision-making process. The study was conducted among 300 international consumers. The results showed that personal and public IS were far more reliable than commercial ones. The findings indicate that traditional marketing tools are no longer viable in the SM milieu.
Abstract:
Objective: To determine the impact of a free-choice diet on nutritional intake and body condition of feral horses. Animals: Cadavers of 41 feral horses from 5 Australian locations. Procedures: Body condition score (BCS) was determined (scale of 1 to 9), and the stomach was removed from horses during postmortem examination. Stomach contents were analyzed for nutritional variables and macroelement and microelement concentrations. Data were compared among the locations and also compared with recommended daily intakes for horses. Results: Mean BCS varied by location; all horses were judged to be moderately thin. The BCS for males was 1 to 3 points higher than that of females. Amount of protein in the stomach contents varied from 4.3% to 14.9% and was significantly associated with BCS. Amounts of water-soluble carbohydrate and ethanol-soluble carbohydrate in stomach contents of feral horses from all 5 locations were higher than those expected for horses eating high-quality forage. Some macroelement and microelement concentrations were grossly excessive, whereas others were grossly deficient. There was no evidence of ill health among the horses. Conclusions and Clinical Relevance: Results suggested that the diet for several populations of feral horses in Australia appeared less than optimal. However, neither low BCS nor trace mineral deficiency appeared to affect survival of the horses. Additional studies on food sources in these regions, including analysis of water-soluble carbohydrate, ethanol-soluble carbohydrate, and mineral concentrations, are warranted to determine the provenance of such rich sources of nutrients. Determination of the optimal diet for horses may need revision.
Abstract:
Design Led Innovation is emerging as a fundamental business process and is rapidly being adopted by large as well as small to medium-sized firms. The value that design brings to an organisation is a different way of thinking, of framing situations and possibilities, doing things and tackling problems: essentially a cultural transformation of the way the firm undertakes its business. Being Design Led is increasingly seen by business as a driver of company growth, allowing firms to provide a strong point of difference to their stakeholders. Achieving this Design Led process requires strong leadership that enables the organisation to develop a clear vision for top-line growth, one based on deep customer insights and expanded through customer and stakeholder engagements, the outcomes of which are then adopted by all aspects of the business. To achieve this goal, several tools and processes are available, which need to be linked to new organisational capabilities within a business transformation context. The Design Led Innovation Team focuses on embedding tools and processes within an organisation and matching this with design leadership qualities to enable companies to create breakthrough innovation and achieve sustained growth, ultimately by transforming their business model. As all information for these case studies was derived from publicly accessible data, this resource is not intended to be used as reference material, but rather as a learning tool for designers to begin to consider and explore businesses at a strategic level. It is not the results that are key, but rather the process and philosophies that were used to create these case studies and disseminate this way of thinking amongst the design community. It is this process of unpacking a business, guided by the framework of Osterwalder’s Business Model Canvas, that provides an important tool for designers to gain a greater perspective of a company’s true innovation potential.
Abstract:
This paper presents the idea of a compendium of process technologies, i.e., a concise but comprehensive collection of techniques for process model analysis that support research on the design, execution, and evaluation of processes. The idea originated from observations on the evolution of process-related research disciplines. Based on these observations, we derive design goals for a compendium. Then, we present the jBPT library, which addresses these goals by means of an implementation of common analysis techniques in an open source codebase.
Abstract:
We describe a pedagogical approach that addresses challenges in design education for novices. These include an inability to frame new problems and limited-to-no design capability or domain knowledge. Such challenges can reduce student engagement with design practice, lead to derivative design solutions, and prompt the inappropriate simplification of design assignments and assessment criteria by educators. We argue that a curriculum that develops the student’s design process will enable students to deal with the uncertain and dynamic situations that characterise design. We describe how this may be achieved and explain our pedagogical approach in terms of methods from Reflective Practice and theories of abstraction and creativity. We present a recently taught landscape architecture unit as an example. It consists of design exercises that require little domain or design expertise, to support the development of conceptual thinking and a design rationale. We show how this approach (a) leveraged the novice’s existing spatial and thinking skills while (b) retained contextually rich design situations. Examples of the design exercises taught are described along with samples of student work. The assessment rationale is also presented and explained. Finally, we conclude by reflecting on how this approach relates to innovation, sustainability and other disciplines.
Abstract:
The project investigated the relationships between diversification in modes of delivery, use of information and communication technologies, academics’ teaching practices, and the context in which those practices are employed, in two of the three large universities in Brisbane: Griffith University and the Queensland University of Technology (QUT). The project’s initial plan involved the investigation of two sites: Queensland University of Technology’s Faculty of Education (Kelvin Grove campus) and Griffith University’s Faculty of Humanities (Nathan campus). Interviews associated with the Faculty of Education led to a decision to include a third site, the School of Law within Queensland University of Technology’s Faculty of Law, which is based at the Gardens Point campus. Here the investigation focused on the use of computer-based flexible learning practices, as distinct from the more text-based practices identified within the original two sites.
Abstract:
Teachers of construction economics and estimating have for a long time recognised that there is more to construction pricing than detailed calculation of costs (to the contractor). We always get to the point where we have to say "of course, experience or familiarity with the market is very important and this needs judgement, intuition, etc". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has a minimal effect, little harm would be done, but if it is really important then some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are in cost modelling and cost planning, neither of which pays any significant attention to the market effect. There are currently two schools of thought about the market effect issue. The first school is prepared to ignore possible effects until more is known. This may be called the pragmatic school. The second school exists solely to criticise the first school. We will call this the antagonistic school. Neither the pragmatic nor the antagonistic school seems to be particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose paper in 1974 is still the basic text on the subject, and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and has been since, that the uncertainty surrounding the contractors' costing and cost estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner. A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted. Since 1974 evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practising quantity surveyors on estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprise result was that only a very small amount of information was used by the most expert surveyors for relatively very accurate estimates. Only the type and size of the building, it seemed, was really relevant in determining accuracy. More detailed information about the building's specification, and even sight of the drawings, did not significantly improve their accuracy level. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect.
The problem with this is that our experts do not seem able to verbalise their requirements in this respect, a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means the literature provides little of benefit. Hence the need for this study. It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although a small number of people were involved, they were thought to be reasonably representative of knowledge in construction pricing. They were also very well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held views very close to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from literature relating to the subject, and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal is of an anecdotal nature, we feel that our findings enable at least the basic nature of the subject to be understood and that the factors and their interrelationships can now be examined more formally in relation to construction price levels. I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants who freely gave their time and valuable knowledge to us in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.
Abstract:
This work experimentally examines the performance benefits of a regional CORS network to the GPS orbit and clock solutions for supporting real-time Precise Point Positioning (PPP). The regionally enhanced GPS precise orbit solutions are derived from an evenly distributed global CORS network augmented with a densely distributed network in Australia and New Zealand. A series of computational schemes for different network configurations are adopted in the GAMIT-GLOBK and PANDA data processing. The precise GPS orbit results show that the regionally enhanced solutions achieve overall orbit improvements with respect to the solutions derived from the global network only. Additionally, the orbital differences over GPS satellite arcs that are visible to any of the five Australia-wide CORS stations show a higher percentage of overall improvement compared to the satellite arcs that are not visible from these stations. The regional GPS clock and Uncalibrated Phase Delay (UPD) products are derived using the PANDA real-time processing module from Australian CORS networks of 35 and 79 stations respectively. Analysis of PANDA kinematic PPP and kinematic PPP-AR solutions shows certain overall improvements in positioning performance from a denser network configuration after solution convergence. However, the clock and UPD enhancement on kinematic PPP solutions is marginal. It is suggested that other factors, such as ionospheric effects and incorrectly fixed ambiguities, may be more dominant and deserve further research attention.
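To make the orbit comparison above concrete, a minimal sketch of the kind of difference statistic involved is given below; the function name, the simple 3D RMS metric and the placeholder coordinate values are assumptions for illustration only, not the GAMIT-GLOBK or PANDA processing itself.

```python
import numpy as np

def orbit_rms_difference(sol_a: np.ndarray, sol_b: np.ndarray) -> float:
    """3D RMS difference (metres) between two orbit solutions.

    Both arrays have shape (n_epochs, 3) and hold positions of the same
    satellite at the same epochs; smaller values mean closer agreement.
    """
    diff = sol_a - sol_b                                    # per-epoch position differences
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

# Hypothetical example for one satellite arc (all values are placeholders).
rng = np.random.default_rng(0)
global_only = 2.66e7 + rng.normal(scale=0.05, size=(96, 3))    # ~GPS orbit radius in metres
regional = global_only + rng.normal(scale=0.02, size=(96, 3))  # small perturbation
print(f"3D RMS difference: {orbit_rms_difference(regional, global_only):.3f} m")
```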
Abstract:
Growing community concern about the ecological, social, cultural and economic impacts of housing and urban projects poses new challenges for those who have to deliver them. It is important that these concerns are addressed as part of the community engagement processes on projects. Community engagement is traditionally perceived as the purview of planners and disconnected from the building construction process, despite most project approval processes mandating ongoing community engagement over the project’s entire lifetime. There is evidence pointing to a culture of ambiguity and ambivalence among building professionals about their roles, responsibilities and expectations of community engagement during the construction phase of projects. This has contributed to a culture of distrust between communities and the construction industry. There is a clear need to build capacity among building professionals to empower them as active participants in community engagement processes, which can promote better project outcomes and minimise delays and conflicts. This paper describes a process that utilises the Theory of Planned Behaviour as a framework to equip building professionals with the skills they need to engage effectively with local communities during the construction phase of projects.
Abstract:
The aim of this work is to develop software that is capable of back projecting primary fluence images obtained from EPID measurements through phantom and patient geometries in order to calculate 3D dose distributions. In the first instance, we aim to develop a tool for pre-treatment verification in IMRT. In our approach, a Geant4 application is used to back project primary fluence values from each EPID pixel towards the source. Each beam is considered to be polyenergetic, with a spectrum obtained from Monte Carlo calculations for the LINAC in question. At each step of the ray tracing process, the energy differential fluence is corrected for attenuation and beam divergence. Subsequently, the TERMA is calculated and accumulated into an energy differential 3D TERMA distribution. This distribution is then convolved with monoenergetic point spread kernels, thus generating energy differential 3D dose distributions. The resulting dose distributions are accumulated to yield the total dose distribution, which can then be used for pre-treatment verification of IMRT plans. Preliminary results were obtained for a test EPID image comprising 100 × 100 pixels of unity fluence. Back projection of this field into a 30 cm × 30 cm × 30 cm water phantom was performed, with TERMA distributions obtained in approximately 10 min (running on a single core of a 3 GHz processor). Point spread kernels for monoenergetic photons in water were calculated using a separate Geant4 application. Following convolution and summation, the resulting 3D dose distribution produced familiar build-up and penumbral features. In order to validate the dose model we will use EPID images recorded without any attenuating material in the beam for a number of MLC-defined square fields. The dose distributions in water will be calculated and compared to TPS predictions.
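As a rough illustration of the pipeline described above (back-projected fluence attenuated and corrected for divergence, accumulated into TERMA, then convolved with a point spread kernel), the sketch below assumes a homogeneous water phantom, a single energy bin and a placeholder kernel; it is not the Geant4 application itself and the attenuation coefficient is only approximate.

```python
import numpy as np
from scipy.ndimage import convolve

def terma_along_ray(fluence0, mu_over_rho, rho, depths_cm, ssd_cm=100.0):
    """TERMA along one back-projected ray in a homogeneous medium (one energy bin).

    fluence0    : energy fluence entering the phantom along this ray
    mu_over_rho : mass attenuation coefficient (cm^2/g)
    rho         : density (g/cm^3)
    depths_cm   : depths along the ray (cm)
    ssd_cm      : source-surface distance, used for the inverse-square divergence term
    """
    attenuation = np.exp(-mu_over_rho * rho * depths_cm)
    divergence = (ssd_cm / (ssd_cm + depths_cm)) ** 2      # inverse-square falloff
    return mu_over_rho * fluence0 * attenuation * divergence

# Toy example: one central ray of unity fluence in a coarse water grid,
# ~1 MeV photons (mu/rho approximately 0.0707 cm^2/g in water).
depths = np.arange(0.0, 30.0, 1.0)
terma = np.zeros((30, 21, 21))
terma[:, 10, 10] = terma_along_ray(1.0, 0.0707, 1.0, depths)

# Convolve the TERMA volume with a placeholder point spread kernel to obtain dose;
# a real kernel would come from Monte Carlo and be applied per energy bin.
kernel = np.ones((3, 3, 3)) / 27.0
dose = convolve(terma, kernel, mode="constant")
print(dose.shape, dose.max())
```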
Abstract:
Dose kernels may be used to calculate dose distributions in radiotherapy (as described by Ahnesjo et al., 1999). Their calculation requires use of Monte Carlo methods, usually by forcing interactions to occur at a point. The Geant4 Monte Carlo toolkit provides a capability to force interactions to occur in a particular volume. We have modified this capability and created a Geant4 application to calculate dose kernels in Cartesian, cylindrical, and spherical scoring systems. The simulation considers monoenergetic photons incident at the origin of a 3 m × 3 m × 3 m water volume. Photons interact via Compton scattering, the photoelectric effect, pair production, and Rayleigh scattering. By default, Geant4 models photon interactions by sampling a physical interaction length (PIL) for each process. The process returning the smallest PIL is then considered to occur. In order to force the interaction to occur within a given length, L_FIL, we scale each PIL according to the formula PIL_forced = L_FIL × (1 − exp(−PIL/PIL_0)), where PIL_0 is a constant. This ensures that the process occurs within L_FIL, whilst correctly modelling the relative probability of each process. Dose kernels were produced for incident photon energies of 0.1, 1.0, and 10.0 MeV. In order to benchmark the code, dose kernels were also calculated using the EGSnrc Edknrc user code. Identical scoring systems were used; namely, the collapsed cone approach of the Edknrc code. Relative dose difference images were then produced. Preliminary results demonstrate the ability of the Geant4 application to reproduce the shape of the dose kernels; median relative dose differences of 12.6, 5.75, and 12.6 % were found for incident photon energies of 0.1, 1.0, and 10.0 MeV respectively.
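A minimal sketch of the forced-interaction scaling described above is given below; the exponential sampling of the unforced PILs, the mean free paths and the constants are illustrative assumptions, not values from the Geant4 application.

```python
import numpy as np

def forced_interaction_length(pil, l_fil, pil_0):
    """Scale a physical interaction length so the interaction falls within l_fil.

    Implements PIL_forced = L_FIL * (1 - exp(-PIL / PIL_0)): an unbounded PIL is
    mapped monotonically into [0, l_fil), so comparing the forced PILs of
    competing processes preserves which process "wins".
    """
    return l_fil * (1.0 - np.exp(-np.asarray(pil) / pil_0))

# Hypothetical example: unforced PILs for two competing processes, sampled from
# exponential distributions (the mean free paths and constants are placeholders).
rng = np.random.default_rng(1)
pil_compton = rng.exponential(scale=14.0, size=5)
pil_photoelectric = rng.exponential(scale=300.0, size=5)

l_fil, pil_0 = 1.0, 20.0
forced_compton = forced_interaction_length(pil_compton, l_fil, pil_0)
forced_photoelectric = forced_interaction_length(pil_photoelectric, l_fil, pil_0)

# The process with the smallest forced PIL occurs, mirroring Geant4's default rule.
print(np.where(forced_compton < forced_photoelectric, "compton", "photoelectric"))
```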
Abstract:
We examine which capabilities technologies provide to support collaborative process modeling. We develop a model that explains how technology capabilities impact cognitive group processes, and how they lead to improved modeling outcomes and positive technology beliefs. We test this model through a free simulation experiment of collaborative process modelers structured around a set of modeling tasks. With our study, we provide an understanding of the process of collaborative process modeling, and detail implications for research and guidelines for the practical design of collaborative process modeling.
Abstract:
Hepatitis C, which was first identified in 1988, has become an important issue for public health as epidemiological and clinical evidence has emerged. These disciplines have highlighted the extent of infection and its medical consequences. Now, governments at both the state and federal levels are sifting through this evidence and are attempting to create structures to deal with the problem of hepatitis C. These structures have generally taken the form of expert committees and working parties organised from established medical, scientific and public health bodies...
Abstract:
Desalination processes to remove dissolved salts from seawater or brackish water include common industrial-scale processes such as reverse osmosis, thermal processes (i.e. multi-stage flash and multiple-effect distillation) and mechanical vapour compression. These processes are very energy intensive. The Institute for Future Environments (IFE) has evaluated various alternative processes to accomplish desalination using renewable or sustainable energy sources. A new process (a solar, thermally driven distillation system based on the principles of a solar still) has been examined. This work presents an initial evaluation of the process.
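To put the "very energy intensive" observation in perspective, a back-of-envelope comparison between the raw latent heat a single-effect still must supply and a typical electrical figure for reverse osmosis is sketched below; the reverse osmosis figure and the no-heat-recovery assumption are illustrative, not results from the IFE evaluation.

```python
# Back-of-envelope energy comparison (illustrative assumptions only).
latent_heat_kj_per_kg = 2257.0      # latent heat of vaporisation of water near 100 C
density_kg_per_m3 = 1000.0

# Thermal energy to evaporate 1 m^3 of water in a single-effect still with no heat recovery.
thermal_kwh_per_m3 = latent_heat_kj_per_kg * density_kg_per_m3 / 3600.0
print(f"Single-effect still, no heat recovery: ~{thermal_kwh_per_m3:.0f} kWh(thermal)/m^3")

# Seawater reverse osmosis is often quoted at roughly 3-5 kWh(electric)/m^3 (assumed
# here as 4), which is why heat recovery or free solar heat matters for thermal routes.
ro_kwh_per_m3 = 4.0
print(f"Ratio to the assumed RO figure: ~{thermal_kwh_per_m3 / ro_kwh_per_m3:.0f}x")
```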
Abstract:
A solar thermal membrane distillation pilot plant was operated for over 70 days in field conditions. The pilot plant incorporated a single spiral-wound, permeate-gap membrane distillation module. All energy used to operate the unit was supplied by solar hot water collectors and photovoltaic panels. The process was able to produce a distillate stream of product water with a conductivity of less than 10 µS/cm, while feed water conductivity varied from 2,400 µS/cm to 106,000 µS/cm. The process is expected to find application in the production of drinking water for remote island and arid regions without the consumption of electrical energy.