983 results for improving standards
Abstract:
This paper is a concise explanation of the normative background to strength grading in Europe, addressing important aspects that are commonly misunderstood by structural engineers and timber researchers. It also highlights changes being made to the standards to: incorporate requirements of the construction products regulations; improve the system to accommodate the latest knowledge and technology; and widen the application of the standards. Where designs need to be optimised, there is an opportunity to use the system more intelligently, in combination with the latest technology, to better fit design values to the true properties of the timber resource. This can bring a design enhancement equivalent to that gained by improving other aspects of the structure, such as connectors and reinforcement. In parallel, researchers working on other aspects of structural improvement need to understand what grades really mean in respect of the properties of the timber, in order to analyse test results correctly. It is also useful to know how techniques used in grading can assist with material property characterisation for research. The amount of destructive testing involved in establishing machine grading settings and visual grading assignments presents a barrier to greater use of local timber and to diversification of commercial species, so it is important that any researcher assessing the properties of such species considers, from the outset, doing the research in a way that can contribute to a grading dataset at a later date. This paper provides an overview of what is required for this.
Abstract:
No data (2013)
Abstract:
With increasing international mobility, higher education must cater to the varying linguistic and cultural needs of students. Successful delivery of courses through English as the vehicular language is essential to encourage international enrollment. However, this cannot be achieved without preparing university professors for the many intricacies that delivering their subjects in English may pose. This paper aims to: share preliminary data concerning Content and Language Integrated Learning (CLIL) at Laureate Network Universities worldwide, as few studies have been conducted at the tertiary level; reflect upon data regarding student and teacher satisfaction with CLIL at the Universidad Europea de Madrid (UEM); and propose improvements in English-taught subjects.
Abstract:
Yeoman, A., Durbin, J. & Urquhart, C. (2004). Evaluating SWICE-R (South West Information for Clinical Effectiveness - Rural). Final report for South West Workforce Development Confederations, (Knowledge Resources Development Unit). Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: South West WDCs (NHS)
Abstract:
Liu, Yonghuai. Improving ICP with Easy Implementation for Free Form Surface Matching. Pattern Recognition, vol. 37, no. 2, pp. 211-226, 2004.
Abstract:
Dissertation presented to the Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Communication Sciences, branch of Marketing and Advertising
Abstract:
We consider the problem of task assignment in a distributed system (such as a distributed Web server) in which task sizes are drawn from a heavy-tailed distribution. Many task assignment algorithms are based on the heuristic that balancing the load at the server hosts will result in optimal performance. We show this conventional wisdom is less true when the task size distribution is heavy-tailed (as is the case for Web file sizes). We introduce a new task assignment policy, called Size Interval Task Assignment with Variable Load (SITA-V). SITA-V purposely operates the server hosts at different loads, and directs smaller tasks to the lighter-loaded hosts. The result is that SITA-V provably decreases the mean task slowdown by significant factors (up to 1000 or more) where the more heavy-tailed the workload, the greater the improvement factor. We evaluate the tradeoff between improvement in slowdown and increase in waiting time in a system using SITA-V, and show conditions under which SITA-V represents a particularly appealing policy. We conclude with a discussion of the use of SITA-V in a distributed Web server, and show that it is attractive because it has a simple implementation which requires no communication from the server hosts back to the task router.
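The size-interval routing at the heart of SITA-V can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the `SitaVRouter` name and the cutoff values are invented for the example, and the paper's key contribution, choosing the cutoffs and host loads to provably minimise mean slowdown, is not reproduced here.

```python
# Sketch of size-interval task assignment: fixed cutoffs partition task
# sizes into intervals, one interval per host. Host 0 receives the
# smallest tasks and is deliberately kept lightly loaded, per SITA-V.

class SitaVRouter:
    def __init__(self, cutoffs):
        # Sorted cutoffs define len(cutoffs) + 1 size intervals, one per
        # host. These boundary values are assumptions for illustration.
        self.cutoffs = sorted(cutoffs)

    def host_for(self, task_size):
        # Route a task to the host whose size interval contains it.
        for host, cutoff in enumerate(self.cutoffs):
            if task_size <= cutoff:
                return host
        return len(self.cutoffs)  # largest tasks go to the last host

router = SitaVRouter(cutoffs=[10, 1000])
assignments = [router.host_for(s) for s in (3, 250, 50000)]
print(assignments)  # small, medium, large task -> hosts 0, 1, 2
```

Because no state flows back from the hosts, the router needs only the task size, which matches the paper's point that SITA-V requires no communication from the server hosts to the task router.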
Abstract:
The current congestion-oriented design of TCP hinders its ability to perform well in hybrid wireless/wired networks. We propose a new improvement on TCP NewReno (NewReno-FF) that uses a new loss labeling technique to discriminate wireless losses from congestion losses. The proposed technique is based on estimating the average and variance of the round-trip time using a filter called the Flip Flop filter, augmented with history information. We show the comparative performance of TCP NewReno, NewReno-FF, and TCP Westwood through extensive simulations. We study the fundamental gains and limits using TCP NewReno with varying loss labeling accuracy (NewReno-LL) as a benchmark. Lastly, our investigation opens up important research directions. First, there is a need for a finer-grained classification of losses (even within congestion and wireless losses) for TCP in heterogeneous networks. Second, it is essential to develop an appropriate control strategy for recovery after the correct classification of a packet loss.
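The loss-labeling idea can be sketched with a standard EWMA estimate of RTT mean and variation. This is an illustrative simplification: the actual Flip Flop filter switches between an agile and a stable EWMA using history information, whereas the single estimator, the constants `ALPHA`, `BETA`, `K`, and the threshold form below are all assumptions for the sketch.

```python
# Sketch of RTT-based loss labeling: keep smoothed estimates of RTT mean
# and variation, then label a loss "wireless" when the RTT at loss time
# is below the congestion region (short queues), else "congestion".

ALPHA, BETA, K = 0.125, 0.25, 2.0  # assumed gains and threshold factor

class RttLossLabeler:
    def __init__(self, first_rtt):
        self.srtt = first_rtt          # smoothed RTT estimate
        self.rttvar = first_rtt / 2.0  # smoothed RTT variation

    def update(self, rtt_sample):
        # EWMA updates in the style of classic TCP RTT estimation.
        self.rttvar = (1 - BETA) * self.rttvar + BETA * abs(self.srtt - rtt_sample)
        self.srtt = (1 - ALPHA) * self.srtt + ALPHA * rtt_sample

    def label_loss(self, rtt_at_loss):
        # Below the threshold, queues are likely short, so the loss is
        # unlikely to be congestion-induced; label it wireless.
        if rtt_at_loss < self.srtt + K * self.rttvar:
            return "wireless"
        return "congestion"

labeler = RttLossLabeler(first_rtt=100.0)
for sample in (102.0, 98.0, 101.0):
    labeler.update(sample)
print(labeler.label_loss(99.0))   # RTT near baseline
print(labeler.label_loss(400.0))  # strongly inflated RTT
```

A labeled wireless loss would then be recovered without shrinking the congestion window, which is the control-strategy question the abstract raises.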
Abstract:
A foundational issue underlying many overlay network applications, ranging from routing to P2P file sharing, is that of connectivity management, i.e., folding new arrivals into the existing mesh and re-wiring to cope with changing network conditions. Previous work has considered the problem from two perspectives: devising practical heuristics for specific applications designed to work well in real deployments, and providing abstractions for the underlying problem that are tractable to address via theoretical analyses, especially game-theoretic analysis. Our work unifies these two thrusts. We first distill insights gleaned from clean theoretical models, notably that under natural resource constraints, selfish players can select neighbors so as to efficiently reach near-equilibria that also provide high global performance. Then, using Egoist, a prototype overlay routing system we implemented on PlanetLab, we demonstrate that our neighbor selection primitives significantly outperform existing heuristics on a variety of performance metrics; that Egoist is competitive with an optimal but unscalable full-mesh approach; and that it remains highly effective under significant churn. We also describe variants of Egoist's current design that would enable it to scale to much larger overlays and cater effectively to applications, such as P2P file sharing in unstructured overlays, that use primitives such as scoped flooding rather than routing.
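A minimal sketch of selfish neighbor selection under a degree budget, in the spirit of (but not reproducing) Egoist's primitives: a node evaluates each candidate by the routing cost it would obtain through that candidate and greedily keeps the k best. The cost model (direct link delay plus the candidate's advertised distances) and the greedy best-response are illustrative assumptions.

```python
# Sketch of selfish neighbor selection: pick k neighbors minimising this
# node's total routing cost, routing each destination via its best neighbor.

def select_neighbors(node, candidates, link_delay, advertised_dist, k):
    destinations = set(advertised_dist[candidates[0]])

    def total_cost(chosen):
        # Cost to reach every destination through the best chosen neighbor.
        return sum(
            min(link_delay[(node, c)] + advertised_dist[c][d] for c in chosen)
            for d in destinations
        )

    # Greedy best-response: repeatedly add the candidate that most
    # reduces total cost, until the degree budget k is exhausted.
    chosen, remaining = [], list(candidates)
    for _ in range(k):
        best = min(remaining, key=lambda c: total_cost(chosen + [c]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

link_delay = {("a", "b"): 1.0, ("a", "c"): 5.0, ("a", "d"): 2.0}
advertised_dist = {
    "b": {"x": 10.0, "y": 1.0},
    "c": {"x": 1.0, "y": 1.0},
    "d": {"x": 4.0, "y": 4.0},
}
chosen = select_neighbors("a", ["b", "c", "d"], link_delay, advertised_dist, k=2)
print(chosen)
```

Note the complementarity: "c" is picked first for low average cost, then "b" is added because it covers destination "y" cheaply; per-candidate evaluation in isolation would miss this.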
Abstract:
In research areas involving mathematical rigor, there are numerous benefits to adopting a formal representation of models and arguments: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, broad accessibility has not been a priority in the design of formal verification tools that can provide these benefits. We propose a few design criteria to address these issues: a simple, familiar, and conventional concrete syntax that is independent of any environment, application, or verification strategy, and the possibility of reducing workload and entry costs by employing features selectively. We demonstrate the feasibility of satisfying such criteria by presenting our own formal representation and verification system. Our system’s concrete syntax overlaps with English, LaTeX, and MediaWiki markup wherever possible, and its verifier relies on heuristic search techniques that make the formal authoring process more manageable and consistent with prevailing practices. We employ techniques and algorithms that ensure a simple, uniform, and flexible definition and design for the system, so that it is easy to augment, extend, and improve.
Abstract:
Comfort is, in essence, satisfaction with the environment; with respect to the indoor environment, it is primarily satisfaction with the thermal conditions and air quality. Improving comfort has social, health and economic benefits, and is more financially significant than any other building cost. Despite this, comfort is not strictly managed throughout the building lifecycle, mainly because no appropriate system exists to carry comfort knowledge through the construction process into operation. Previous proposals to improve knowledge management have not been successfully adopted by the construction industry. To address this, the BabySteps approach was devised. BabySteps, proposed by this research, states that for an innovation to be adopted by the industry it must be implementable through a number of small changes. This research proposes that improving the management of comfort knowledge will improve comfort. ComMet is a new methodology, proposed by this research, that manages comfort knowledge: it enables comfort knowledge to be captured, stored and accessed throughout the building life-cycle, allowing it to be re-used in later stages of the building project and in future projects. It does this using the following:
Comfort Performances – simplified numerical representations of the comfort of the indoor environment, quantifying the comfort at each stage of the building life-cycle using standard comfort metrics.
Comfort Ratings – a means of classifying the comfort conditions of the indoor environment according to an appropriate standard. Comfort Ratings are generated by comparing different Comfort Performances, and provide additional information about the comfort conditions that is not readily determined from the individual Comfort Performances.
Comfort History – a continuous descriptive record of the comfort throughout the project, focused on documenting the items and activities, proposed and implemented, that could potentially affect comfort. Each aspect of the Comfort History is linked to the relevant comfort entity it references.
These three components create a comprehensive record of the comfort throughout the building lifecycle. They are stored and made available in a common format in a central location, which allows them to be re-used indefinitely. The LCMS system was developed to implement the ComMet methodology. It uses current and emerging technologies to capture, store and allow easy access to comfort knowledge as specified by ComMet. LCMS is an IT system combining six components: building standards; modelling and simulation; physical measurement through the specially developed Egg-Whisk (Wireless Sensor) Network; data manipulation; information recording; and knowledge storage and access. Results from a test-case application of the LCMS system, an existing office room at a research facility, highlighted that while some aspects of comfort were being maintained, the building’s environment did not comply with the acceptable levels stipulated by the relevant building standards. The implementation of ComMet through LCMS demonstrates how comfort, typically considered only during early design, can be measured and managed through systematic application of the methodology, ensuring a healthy internal environment in the building.
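To make the distinction between a Comfort Performance and a Comfort Rating concrete, a toy sketch follows. The comfort band and rating thresholds are invented for illustration; they are not values from this thesis or from any particular building standard.

```python
# Toy sketch: a Comfort Performance as a numerical summary of measured
# conditions, and a Comfort Rating as its classification against assumed
# standard-style thresholds.

def comfort_performance(temps, lo=20.0, hi=24.0):
    """Fraction of measured samples falling inside the comfort band."""
    in_band = sum(1 for t in temps if lo <= t <= hi)
    return in_band / len(temps)

def comfort_rating(performance):
    """Classify a Comfort Performance into a simple letter rating.
    Thresholds here are assumptions for illustration only."""
    if performance >= 0.95:
        return "A"
    if performance >= 0.80:
        return "B"
    return "C"

hourly_temps = [21.5, 22.0, 23.8, 25.1, 21.0, 19.4, 22.2, 22.9]
perf = comfort_performance(hourly_temps)
print(perf, comfort_rating(perf))  # 0.75 of samples in band -> rating "C"
```

Comparing such performances across life-cycle stages (design prediction vs. measured operation) is what yields the additional information the methodology attributes to Comfort Ratings.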
Abstract:
Contemporary IT standards are designed, not selected. Their design involves a complex process that brings together a coalition of players. We examine the design of the SOAP standard to discover activity patterns in this design process. The paper reports these patterns as a precursor to developing a micro-level process theory for designing IT standards.
Abstract:
According to EUSOMA position paper 'The requirements of a specialist breast unit', each breast unit should have a core team made up of health professionals who have undergone specialist training in breast cancer. In this paper, on behalf of EUSOMA, authors have identified the standards of training in breast cancer, to harmonise and foster breast care training in Europe. The aim of this paper is to contribute to the increase in the level of care in a breast unit, as the input of qualified health professionals increases the quality of breast cancer patient care.
Abstract:
Real decision makers exhibit significant shortcomings in the generation of objectives for decisions that they face. Prior research has illustrated the magnitude of this shortcoming but not its causes. In this paper, we identify two distinct impediments to the generation of decision objectives: not thinking broadly enough about the range of relevant objectives, and not thinking deeply enough to articulate every objective within the range that is considered. To test these explanations and explore ways of stimulating a more comprehensive set of objectives, we present three experiments involving a variety of interventions: the provision of sample objectives, organization of objectives by category, and direct challenges to do better, with or without a warning that important objectives are missing. The use of category names and direct challenges with a warning both led to improvements in the quantity of objectives generated without impacting their quality; other interventions yielded less improvement. We conclude by discussing the relevance of our findings to decision analysis and offering prescriptive implications for the elicitation of decision objectives. © 2010 INFORMS.
Abstract:
In some supply chains, materials are ordered periodically according to local information. This paper investigates how to improve the performance of such a supply chain. Specifically, we consider a serial inventory system in which each stage implements a local reorder interval policy; i.e., each stage orders up to a local basestock level according to a fixed-interval schedule. A fixed cost is incurred for placing an order. Two improvement strategies are considered: (1) expanding the information flow by acquiring real-time demand information and (2) accelerating the material flow via flexible deliveries. The first strategy leads to a reorder interval policy with full information; the second strategy leads to a reorder point policy with local information. Both policies have been studied in the literature. Thus, to assess the benefit of these strategies, we analyze the local reorder interval policy. We develop a bottom-up recursion to evaluate the system cost and provide a method to obtain the optimal policy. A numerical study shows the following: Increasing the flexibility of deliveries lowers costs more than does expanding information flow; the fixed order costs and the system lead times are key drivers that determine the effectiveness of these improvement strategies. In addition, we find that inferring reorder intervals from the optimal batch sizes of the reorder point policy and the demand rate may lead to significant cost inefficiency. © 2010 INFORMS.
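The local reorder interval policy can be sketched as a short simulation of a single stage. This is a toy illustration under assumed parameters (review interval T, basestock level S, fixed order cost), not the paper's evaluation model, which also involves holding and backorder costs across serial stages.

```python
# Sketch of a local reorder interval policy at one stage: every T periods
# the stage orders up to the local basestock level S, incurring a fixed
# cost per order placed. Parameter values are illustrative assumptions.

def simulate_stage(demands, T=4, S=20, fixed_order_cost=50.0):
    """Return (order quantities placed, total fixed ordering cost)."""
    inventory_position = S
    orders, total_fixed_cost = [], 0.0
    for period, demand in enumerate(demands):
        inventory_position -= demand
        if period % T == T - 1:          # review epoch on the fixed schedule
            order_qty = S - inventory_position
            if order_qty > 0:
                orders.append(order_qty)
                total_fixed_cost += fixed_order_cost
                inventory_position = S   # order up to the basestock level
    return orders, total_fixed_cost

orders, cost = simulate_stage([3, 5, 2, 4, 6, 1, 0, 7])
print(orders, cost)  # two review epochs, each ordering the interval's demand
```

The trade-off the paper studies is visible even here: a longer T amortises the fixed cost over fewer, larger orders, while a reorder point policy would instead trigger on inventory position regardless of the schedule.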