34 results for Design-based research
in CentAUR: Central Archive University of Reading - UK
Abstract:
In participatory design situations, the competence of the facilitator influences the opportunities for a user group to become engaged in the process of design. Based on observation of the conversations in a series of design workshops, the performance of design facilitation expertise by an expert architect is compared with that of a less experienced architectural graduate. The skills that are the focus of this research are the conversational competences deployed by architects to engage users in the design of an architectural project. The differences in conversational behaviour between the project architect and the less experienced graduate are used to illustrate, with examples, the effect that the performance of facilitation had on the opportunity for user engagement in design, and the learning of facilitation skills that occurred in these situations.
Abstract:
This report describes the concept for a clinical trial that uses carbamazepine as the gold-standard active control in a study of newly diagnosed patients. The authors describe an endpoint combining efficacy and tolerability, and a stopping rule that uses a series of interim analyses in order to reach a conclusion as efficiently as possible without sacrificing reliability.
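Purely as an illustration of the kind of group-sequential design the abstract describes, the sketch below simulates a two-arm trial with interim analyses and a constant (Pocock-style) stopping boundary. The response rates, number of looks, stage size, and boundary value are assumptions for illustration, not the trial's actual parameters.

```python
# Illustrative sketch of a group-sequential stopping rule with interim analyses.
# All parameters (effect sizes, number of looks, boundary) are hypothetical and
# NOT taken from the trial described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def run_trial(p_new=0.65, p_control=0.50, per_arm_per_stage=50,
              n_looks=4, z_boundary=2.36):
    """Simulate one two-arm trial with interim analyses.

    Returns (stopped_early, look_at_stop, final_z). A constant boundary
    (z_boundary) is applied to the pooled two-proportion z-statistic at
    every look; crossing it stops the trial early.
    """
    successes = np.zeros(2)   # [new treatment, control]
    n = np.zeros(2)
    for look in range(1, n_looks + 1):
        successes[0] += rng.binomial(per_arm_per_stage, p_new)
        successes[1] += rng.binomial(per_arm_per_stage, p_control)
        n += per_arm_per_stage
        p_hat = successes / n
        pooled = successes.sum() / n.sum()
        se = np.sqrt(pooled * (1 - pooled) * (1 / n[0] + 1 / n[1]))
        z = (p_hat[0] - p_hat[1]) / se
        if abs(z) >= z_boundary:          # boundary crossed: stop early
            return True, look, z
    return False, n_looks, z

# Fraction of simulated trials that stop before the final analysis.
results = [run_trial() for _ in range(2000)]
early = np.mean([r[0] for r in results])
print(f"stopped early in {early:.1%} of simulations")
```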
Abstract:
Design management research usually deals with the processes within the professional design team and yet, in the UK, the volume of the total project information produced by the specialist trade contractors equals or exceeds that produced by the design team. There is a need to understand the scale of this production task and to plan and manage it accordingly. The model of the process on which the plan is to be based, while generic, must be sufficiently robust to cover the majority of instances. An approach using design elements, developed in sufficient depth to support tools for a predictive model of the process, is described. The starting point is that each construction element and its components have a generic sequence of design activities. Specific requirements tailor the element's application to the building. Constraints then arise from interactions with other elements. Therefore, the selection of a component within an element may impose a set of constraints that will affect the choice of other design elements. Thus, a design decision can be seen as an interrelated element-constraint-element (ECE) sub-net. To illustrate this approach, an example of the process within precast concrete cladding is used.
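One minimal way to picture the element-constraint-element (ECE) sub-net is as a small mapping in which choosing a component for one element propagates constraints to other elements. The element names, components, and constraints below are invented for illustration and are not taken from the paper's precast concrete cladding example.

```python
# Illustrative sketch of an element-constraint-element (ECE) sub-net.
# Element names, components and constraints are invented; they are not the
# precast concrete cladding example used in the paper.
from collections import defaultdict

# element -> chosen component -> list of (affected_element, constraint) pairs
ece_net = {
    "cladding_panel": {
        "precast_concrete": [
            ("structural_frame", "edge beams must carry panel dead load"),
            ("fixings", "cast-in sockets required"),
        ],
        "lightweight_rainscreen": [
            ("structural_frame", "secondary steel framing required"),
        ],
    },
}

def propagate(element, component):
    """Return the constraints a component choice imposes on other elements."""
    affected = defaultdict(list)
    for other, constraint in ece_net.get(element, {}).get(component, []):
        affected[other].append(constraint)
    return dict(affected)

print(propagate("cladding_panel", "precast_concrete"))
```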
Abstract:
Recent interest in material objects - the things of everyday interaction - has led to articulations of their role in the literature on organizational knowledge and learning. What is missing is a sense of how the use of these 'things' is patterned across both industrial settings and time. This research addresses this gap with a particular emphasis on visual materials. Practices are analysed in two contrasting design settings: a capital goods manufacturer and an architectural firm. Materials are observed to be treated both as frozen, and hence unavailable for change, and as fluid, open and dynamic. In each setting, temporal patterns of unfreezing and refreezing are associated with the different types of materials used. The research suggests that these differing patterns or rhythms of visual practice are important in the evolution of knowledge and in structuring social relations for delivery. Hence, to improve their performance, practitioners should not only consider the types of media they use, but also reflect on the pace and style of their interactions.
Abstract:
Driven by new network and middleware technologies such as mobile broadband, near-field communication, and context awareness, the so-called ambient lifestyle will foster innovative use cases in building automation, healthcare and agriculture. In the EU project Hydra, high-level security, trust and privacy concerns such as loss of control, profiling and surveillance are considered at the outset. By the end of this project the Hydra middleware development platform will have been designed so as to enable developers to realise secure ambient scenarios, especially in the user domains of building automation, healthcare, and agriculture. This paper gives a short introduction to the Hydra project, its user domains and its approach to ensuring security by design. Based on the results of a focus group analysis of the building automation domain, typical threats are evaluated and their risks are assessed. Specific security requirements with respect to security, privacy, and trust are then derived in order to incorporate them into the Hydra Security Meta Model. How concepts such as context security, semantic security, and virtualisation support the overall Hydra approach is introduced and illustrated on the basis of a technical building automation scenario.
Abstract:
Ethnographic methodologies developed in social anthropology and sociology hold considerable promise for addressing practical, problem-based research concerned with the construction site. The extended researcher-engagement characteristic of ethnography reveals rich insights, yet is infrequently used to understand how workplace realities are lived out on construction sites. Moreover, studies that do employ these methods are rarely reported within construction research journals. This paper argues that recent innovations in ethnographic methodologies offer new routes to: posing questions; understanding workplace socialities (i.e. the qualities of the social relationships that develop on construction sites); learning about forms, uses and communication of knowledge on construction sites; and turning these into meaningful recommendations. This argument is supported by examples from an interdisciplinary ethnography concerning migrant workers and communications on UK construction sites. The presented research seeks to understand how construction workers communicate with managers and each other and how they stay safe on site, with the objective of informing site health-and-safety strategies and the production and evaluation of training and other materials.
Abstract:
1. Closed Ecological Systems (CES) are small man-made ecosystems that do not have any material exchange with the surrounding environment. Recent ecological and technological advances enable the successful establishment and maintenance of CES, making them a suitable tool for detecting and measuring subtle feedbacks and mechanisms.
2. As part of an analogue (physical) C cycle modelling experiment, we developed a non-intrusive methodology to control the internal environment and to monitor atmospheric CO2 concentration inside 16 replicated CES. While maintaining an air-tight seal on all CES, this approach allowed access to the CO2 measuring equipment for periodic re-calibration and repairs.
3. To ensure reliable cross-comparison of CO2 observations between individual CES units and to minimise the cost of the system, only one CO2 sampling unit was used. An ADC BioScientific OP-2 (open-path) analyser mounted on a swinging arm passed over a set of 16 measuring cells, each connected to an individual CES with air continuously circulating between them.
4. Using this setup, we were able to continuously measure several environmental variables and the CO2 concentration within each closed system, allowing us to study minute effects of changing temperature on C fluxes within each CES. The CES and the measuring cells showed minimal air leakage during experimental runs lasting, on average, 3 months. The CO2 analyser assembly performed reliably for over 2 years; however, an early iteration of the present design proved sensitive to positioning errors.
5. We indicate how the methodology can be further improved and suggest possible avenues where future CES-based research could be applied.
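As a rough illustration only, the sketch below shows how a single analyser might be multiplexed across the 16 measuring cells in a round-robin cycle. The dwell time, the `read_co2()` stand-in, and the logging format are hypothetical; the actual rig used an ADC BioScientific OP-2 open-path analyser on a swinging arm.

```python
# Illustrative sketch of multiplexing one CO2 analyser across 16 measuring
# cells, one per closed ecological system (CES). read_co2() and the dwell
# time are placeholders, not the instrument driver used in the study.
import time

N_CELLS = 16

def read_co2(cell_id: int) -> float:
    """Stand-in for the instrument driver: return a CO2 reading (ppm)."""
    return 400.0 + cell_id  # placeholder value

def measurement_cycle(dwell_s: float = 0.01):
    """One pass over all cells; returns (timestamp, cell, ppm) rows."""
    log = []
    for cell in range(N_CELLS):
        # a positioning call would move the arm over `cell` here (hypothetical)
        time.sleep(dwell_s)          # allow the reading to settle
        log.append((time.time(), cell, read_co2(cell)))
    return log

print(len(measurement_cycle()), "readings per cycle")
```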
Abstract:
A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is first derived and applied to reduce the rule base, followed by a fine model detection process based on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental design-based criteria are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric; in the later stage, the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error as well as penalising the model parameter variance. The use of NeuDeC leads to unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
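As an illustration of the kind of composite cost described, the sketch below performs forward selection of candidate model terms scored by mean squared prediction error plus an A-optimality penalty, trace((X^T X)^{-1}), on parameter variance. The data, candidate terms, and weighting are invented; this is not the NeuDeC algorithm itself.

```python
# Illustrative sketch of forward model-term selection with a composite cost:
# squared prediction error plus an A-optimality penalty on parameter variance.
# Data, candidate terms and the weighting lambda are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n, n_candidates = 200, 12
P = rng.normal(size=(n, n_candidates))          # candidate regressors (e.g. rule outputs)
y = 2.0 * P[:, 0] - 1.5 * P[:, 3] + 0.1 * rng.normal(size=n)

def composite_cost(X, y, lam=0.05):
    """Mean squared error plus lam * trace((X^T X)^{-1}) (A-optimality penalty)."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    mse = np.mean((y - X @ theta) ** 2)
    a_opt = np.trace(np.linalg.inv(X.T @ X))
    return mse + lam * a_opt

selected, remaining = [], list(range(n_candidates))
for _ in range(4):                              # select up to four terms
    scores = {j: composite_cost(P[:, selected + [j]], y) for j in remaining}
    best = min(scores, key=scores.get)
    selected.append(best)
    remaining.remove(best)

print("selected terms:", selected)
```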
Abstract:
PURPOSE: Since the microblogging system Twitter was introduced in 2006, messages posted to it have provided a rich dataset for researchers, leading to the publication of over a thousand academic papers. This paper aims to identify this published work and to classify it in order to understand Twitter-based research.
DESIGN/METHODOLOGY/APPROACH: First, the papers on Twitter were identified. Second, following a review of the literature, a classification of the dimensions of microblogging research was established. Third, papers were qualitatively classified using open-coded content analysis, based on each paper's title and abstract, in order to analyze method, subject, and approach.
FINDINGS: The majority of published work relating to Twitter concentrates on aspects of the messages sent and details of the users. A variety of methodological approaches are used across a range of identified domains.
RESEARCH LIMITATIONS/IMPLICATIONS: This work reviewed the abstracts of all papers available via database search on the term "Twitter", which has two major implications: (1) the full papers are not considered, so works may be misclassified if their abstract is not clear; (2) publications not indexed by the databases, such as book chapters, are not included.
ORIGINALITY/VALUE: To date there has not been an overarching study of the methods and purposes of those using Twitter as a research subject. Our major contribution is to scope out the papers published on Twitter up to the close of 2011. The classification derived here will provide a framework within which researchers studying Twitter-related topics will be able to position and ground their work.
Abstract:
Sampling strategies for monitoring the status and trends in wildlife populations are often determined before the first survey is undertaken. However, there may be little information about the distribution of the population, so the sample design may be inefficient. Through time, as data are collected, more information about the distribution of animals in the survey region is obtained, but it can be difficult to incorporate this information in the survey design. This paper introduces a framework for monitoring motile wildlife populations within which the design of future surveys can be adapted using data from past surveys whilst ensuring consistency in design-based estimates of status and trends through time. In each survey, part of the sample is selected from the previous survey sample using simple random sampling. The rest is selected with inclusion probability proportional to predicted abundance. Abundance is predicted using a model constructed from previous survey data and covariates for the whole survey region. Unbiased design-based estimators of status and trends and their variances are derived from two-phase sampling theory. Simulations over the short and long term indicate that, in general, more precise estimates of status and trends are obtained using this mixed strategy than a strategy in which all of the sample is retained or all is selected with probability proportional to predicted abundance. Furthermore, the mixed strategy is robust to poor predictions of abundance. Estimates of status are more precise than those obtained from a rotating panel design.
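A minimal sketch of the mixed selection step, under illustrative assumptions (population size, sample sizes, and model-predicted abundances are invented), might look as follows: part of the new sample is retained from the previous sample by simple random sampling, and the rest is drawn with probability proportional to predicted abundance, here approximated by successive unequal-probability draws rather than a strict probability-proportional-to-size design.

```python
# Illustrative sketch of the mixed two-phase selection step described in the
# abstract. Sizes and predicted abundances are invented; successive
# unequal-probability draws approximate a strict PPS selection.
import numpy as np

rng = np.random.default_rng(2)

N = 500                                   # survey units in the region
previous_sample = rng.choice(N, size=60, replace=False)
predicted_abundance = rng.gamma(shape=2.0, scale=5.0, size=N)  # from a fitted model

n_retain, n_new = 30, 30

# Phase 1: simple random subsample of the previous survey sample.
retained = rng.choice(previous_sample, size=n_retain, replace=False)

# Phase 2: remaining units drawn with probability proportional to prediction.
candidates = np.setdiff1d(np.arange(N), retained)
p = predicted_abundance[candidates]
p = p / p.sum()
new_units = rng.choice(candidates, size=n_new, replace=False, p=p)

sample = np.concatenate([retained, new_units])
print(sample.size, "units selected for the next survey")
```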
Abstract:
The objective was to measure effects of 3-nitrooxypropanol (3NP) on methane production of lactating dairy cows and any associated changes in digestion and energy and nitrogen metabolism. Six Holstein-Friesian dairy cows in mid-lactation were fed twice daily a total mixed ration with maize silage as the primary forage source. Cows received 1 of 3 treatments using an experimental design based on two 3 × 3 Latin squares with 5-wk periods. Treatments were a control placebo or 500 or 2,500 mg/d of 3NP delivered directly into the rumen, via the rumen fistula, in equal doses before each feeding. Measurements of methane production and energy and nitrogen balance were obtained during wk 5 of each period using respiration calorimeters and digestion trials. Measurements of rumen pH (48 h) and postprandial volatile fatty acid and ammonia concentrations were made at the end of wk 4. Daily methane production was reduced by 3NP, but the effects were not dose dependent (reductions of 6.6 and 9.8% for 500 and 2,500 mg/d, respectively). Dosing 3NP had a transitory inhibitory effect on methane production, which may have been due to the product leaving the rumen in liquid outflow or through absorption or metabolism. Changes in rumen concentrations of volatile fatty acids indicated that the pattern of rumen fermentation was affected by both doses of the product, with a decrease in acetate:propionate ratio observed, but that acetate production was inhibited by the higher dose. Dry matter, organic matter, acid detergent fiber, N, and energy digestibility were reduced at the higher dose of the product. The decrease in digestible energy supply was not completely countered by the decrease in methane excretion such that metabolizable energy supply, metabolizable energy concentration of the diet, and net energy balance (milk plus tissue energy) were reduced by the highest dose of 3NP. Similarly, the decrease in nitrogen digestibility at the higher dose of the product was associated with a decrease in body nitrogen balance that was not observed for the lower dose. Milk yield and milk fat concentration and fatty acid composition were not affected but milk protein concentration was greater for the higher dose of 3NP. Twice-daily rumen dosing of 3NP reduced methane production by lactating dairy cows, but the dose of 2,500 mg/d reduced rumen acetate concentration, diet digestibility, and energy supply. Further research is warranted to determine the optimal dose and delivery method of the product. Key words: 3-nitrooxypropanol, methane, digestion, rumen, dairy cow
Abstract:
Purpose – This paper seeks to make the case for new research into the perceived fairness and impact of executive pay.
Design/methodology/approach – The paper reviews the literature regarding executive compensation and corporate performance and examines the evidence that a more egalitarian approach to pay could be justified in terms of long-term shareholder value.
Findings – There appears to be no evidence that the growing gap between the pay of executives and that of the average employee generates long-term enterprise value, and it may even be detrimental to firms, if not to the liberal capitalist consensus on which the corporate licence to operate is based.
Research limitations/implications – The paper outlines a new approach to tracking income differentials against corporate performance through the development of a corporate Gini coefficient "league table".
Social implications – The proposed research is expected to point towards better practice in executive remuneration and to support the growing momentum for a sustainable and enlightened approach to business, in which the key goal is long-term enterprise value based on a fair distribution of the rewards of business.
Originality/value – In producing a deeper understanding of the impact of widening income differentials, the paper should be of interest to senior executives in publicly quoted companies as well as press commentators, government officials and academics.
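To make the proposed corporate Gini coefficient concrete, the sketch below computes the Gini coefficient of a firm's pay distribution, the figure a corporate Gini "league table" would rank. The pay figures are invented for illustration.

```python
# Illustrative sketch: Gini coefficient of a firm's pay distribution.
# Pay figures are invented; 0 means perfect equality, 1 maximal inequality.
import numpy as np

def gini(pay):
    """Gini coefficient via the sorted-values formulation."""
    pay = np.sort(np.asarray(pay, dtype=float))
    n = pay.size
    # G = sum_i (2i - n - 1) * x_i / (n * sum(x)) for sorted x, i = 1..n
    index = np.arange(1, n + 1)
    return np.sum((2 * index - n - 1) * pay) / (n * pay.sum())

salaries = np.concatenate([np.full(950, 28_000),    # most employees
                           np.full(45, 90_000),     # managers
                           np.full(5, 2_500_000)])  # executives
print(f"corporate Gini: {gini(salaries):.2f}")
```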