936 results for Construction techniques
Abstract:
The use of subjective measures in epidemiology has intensified recently, notably with the increasingly clear intention to integrate subjects' perception of their own health into the study of diseases and the evaluation of interventions. Psychometrics encompasses the statistical methods used for constructing questionnaires and for analysing the data they produce. The aim of this thesis was to explore several methodological problems raised by the use of psychometric techniques in epidemiology. Three empirical studies are presented, concerning: 1/ the instrument validation phase: the objective was to develop, using simulated data, a sample-size calculation tool for scale validation in psychiatry; 2/ the mathematical properties of the resulting measure: the objective was to compare the performance of the minimal clinically important difference of a questionnaire computed on cohort data, either within the framework of classical test theory (CTT) or within that of item response theory (IRT); 3/ its use in a longitudinal design: the objective was to compare, using simulated data, the performance of a statistical method for analysing the longitudinal evolution of a subjective phenomenon measured with CTT or IRT, in particular when some of the items available for measurement differed at each time point. Finally, directed acyclic graphs were used to discuss, in the light of the results of these three studies, the notion of information bias when subjective measures are used in epidemiology.
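The sample-size question in the first study can be illustrated with a small Monte Carlo sketch, which is not the thesis's actual tool: under an assumed one-factor model with illustrative loadings, the spread of the Cronbach's-alpha estimate is simulated at different sample sizes, and a sample-size rule would pick the smallest n whose spread meets the required precision.

```python
import random
import statistics

def simulate_alpha(n_subjects, n_items, loading=0.6, seed=0):
    """Simulate one dataset under an assumed one-factor model and
    return Cronbach's alpha for the n_items scale."""
    rng = random.Random(seed)
    noise_sd = (1 - loading ** 2) ** 0.5
    data = []
    for _ in range(n_subjects):
        factor = rng.gauss(0, 1)
        data.append([loading * factor + noise_sd * rng.gauss(0, 1)
                     for _ in range(n_items)])
    item_vars = [statistics.variance(col) for col in zip(*data)]
    total_var = statistics.variance([sum(row) for row in data])
    k = n_items
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

def alpha_precision(n_subjects, n_items, reps=200):
    """Monte Carlo mean and spread (SD) of the alpha estimate
    at a given sample size."""
    alphas = [simulate_alpha(n_subjects, n_items, seed=s) for s in range(reps)]
    return statistics.mean(alphas), statistics.stdev(alphas)

# The spread of the alpha estimate shrinks as the sample grows.
mean_small, sd_small = alpha_precision(50, 10)
mean_large, sd_large = alpha_precision(400, 10)
```

The loadings, number of items, and precision criterion are all placeholders; a real tool would also cover the multidimensional structures typical of psychiatric scales.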
Abstract:
Chapter 1 presents a brief note on the present state of the construction industry, bringing into focus the significance of the critical study. The relevance of the study, the area of investigation and the objectives of the study are outlined in this chapter. The second chapter presents a review of the literature in the relevant areas. In the third chapter, an analysis of time and cost overruns in construction is presented, highlighting the major factors responsible for them. A couple of case studies estimating the loss to the nation on account of delays in construction are presented in that chapter. The need for an appropriate estimate and a competent contractor is emphasised for improving effectiveness in project implementation. Certain useful equations and thoughts in this area have been formulated in this chapter that can be followed in the State PWD and other Government organisations. Case studies on the implementation of major projects undertaken by Government sponsored/supported organisations in Kerala are dealt with in Chapter 4. A detailed description of the Kerala Legislature Complex project, with a critical analysis, is given in this chapter, together with a detailed account of the investigations carried out on the construction of the International Stadium, a sports project of the Greater Cochin Development Authority. The project details of Cochin International Airport at Nedumbassery, its promoters and contractors, are also discussed in Chapter 4. The various aspects of implementation which made the above projects successful are discussed in Chapter 5. The data collected were analysed through discussion and perception to arrive at certain conclusions. The emergence of the front-loaded contract and its impact on the economics of project execution are dealt with in this chapter, and the delays in the various projects narrated in Chapter 3 are analysed here. The root causes of project time and cost overruns and their remedial measures are also listed in this chapter. The study of cost and time overruns of any construction project is a part of construction management. Under the present environment of heavy investment in construction activities in India, the consequences of mismanagement often lead to excessive expenditure that could have been avoided. Cost consciousness therefore has to be keener than ever before, and optimisation of investment can be achieved by improved dynamism in construction management. The successful completion of construction projects within the specified programme, optimising the three major attributes of the process - quality, schedule and cost - has become the most valuable and challenging task for engineer-managers to perform. So the various aspects of construction management, such as cost control, schedule control, quality assurance and management techniques, are also discussed in this fifth chapter. Chapter 6 summarises the conclusions drawn from the above critical study of major construction projects in Kerala.
Abstract:
In developing techniques for monitoring the costs associated with different procurement routes, the central task is disentangling the various project costs incurred by organizations taking part in construction projects. While all firms are familiar with the need to analyse their own costs, it is unusual to apply the same kind of analysis to projects. The purpose of this research is to examine the claims that new ways of working such as strategic alliancing and partnering bring positive business benefits. This requires that costs associated with marketing, estimating, pricing, negotiation of terms, monitoring of performance and enforcement of contract are collected for a cross-section of projects under differing arrangements, and from those in the supply chain from clients to consultants, contractors, sub-contractors and suppliers. Collaboration with industrial partners forms the basis for developing a research instrument, based on time sheets, which will be relevant for all those taking part in the work. The signs are that costs associated with tendering are highly variable, 1-15%, depending upon what precisely is taken into account. The research to date reveals that there are mechanisms for measuring the costs of transactions and these will generate useful data for subsequent analysis.
Abstract:
Context: Learning can be regarded as knowledge construction in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision which meets their own specific purposes, goals, and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. The users' requirements can be represented as a case in the defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled. However, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing individual users' requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulates, maps, configures, and learning content. The results were used as feedback for improvement. Result: The research work has produced an ontology model with a set of techniques which support the functions of profiling users' requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: The current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling individual users' needs and discovering users' requirements.
Abstract:
Heat waves are expected to increase in frequency and magnitude with climate change. The first part of a study to produce projections of the effect of future climate change on heat-related mortality is presented. Separate city-specific empirical statistical models that quantify significant relationships between summer daily maximum temperature (T max) and daily heat-related deaths are constructed from historical data for six cities: Boston, Budapest, Dallas, Lisbon, London, and Sydney. ‘Threshold temperatures’ above which heat-related deaths begin to occur are identified. The results demonstrate significantly lower thresholds in ‘cooler’ cities exhibiting lower mean summer temperatures than in ‘warmer’ cities exhibiting higher mean summer temperatures. Analysis of individual ‘heat waves’ illustrates that a greater proportion of mortality is due to mortality displacement in cities with less sensitive temperature–mortality relationships than in those with more sensitive relationships, and that mortality displacement is no longer a feature more than 12 days after the end of the heat wave. Validation techniques through residual and correlation analyses of modelled and observed values and comparisons with other studies indicate that the observed temperature–mortality relationships are represented well by each of the models. The models can therefore be used with confidence to examine future heat-related deaths under various climate change scenarios for the respective cities (presented in Part 2).
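The threshold idea described above can be sketched as a simple "hockey-stick" regression: excess deaths are zero below a threshold temperature and rise linearly above it. The grid search, synthetic data and 27 °C threshold below are illustrative assumptions, not the study's actual models or data.

```python
import random

def fit_threshold(temps, deaths, t_grid):
    """Grid-search the hockey-stick model deaths ~ b * max(0, Tmax - T0):
    for each candidate threshold T0, fit the slope b by least squares and
    keep the (T0, b) pair with the smallest squared error."""
    best = None
    for t0 in t_grid:
        x = [max(0.0, t - t0) for t in temps]
        sxx = sum(v * v for v in x)
        if sxx == 0.0:  # no observations above this candidate threshold
            continue
        b = sum(v * d for v, d in zip(x, deaths)) / sxx
        sse = sum((d - b * v) ** 2 for v, d in zip(x, deaths))
        if best is None or sse < best[0]:
            best = (sse, t0, b)
    return best[1], best[2]

# Synthetic check with an assumed true threshold of 27 degrees and an
# assumed slope of 2.5 excess deaths per degree above it, plus noise.
rng = random.Random(1)
temps = [20.0 + 15.0 * rng.random() for _ in range(300)]
deaths = [2.5 * max(0.0, t - 27.0) + rng.gauss(0.0, 1.0) for t in temps]
t0_hat, slope_hat = fit_threshold(temps, deaths, [22.0 + 0.5 * i for i in range(20)])
```

A city-specific model of this shape makes the paper's observation concrete: the fitted T0 would sit lower for "cooler" cities than for "warmer" ones.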
Abstract:
Since its popularization in the 1980s, competitiveness has received close attention from practitioners and researchers across a wide range of industries. In the construction sector, many works on competitiveness have also been published. So far, however, there seems to be no comprehensive review to summarize and critique existing research on competitiveness in construction. This research, therefore, reviews the extant literature from four aspects: the concept of competitiveness, competitiveness research at the construction industry level, competitiveness research at the firm level, and competitiveness research at the project level. The review presents the state-of-the-art development of competitiveness research in construction, identifies the research gaps, and proposes new directions for further studies. Further research is recommended to validate previous studies in construction practice, to identify the mechanisms that encourage mutual enhancement of competitiveness at different levels, and to explore how its sustainability can be achieved by embracing new management and/or economics techniques.
Abstract:
The effects and influence of the Building Research Establishment’s Environmental Assessment Methods (BREEAM) on construction professionals are examined. Most discussions of building assessment methods focus on either the formal tool or the finished product. In contrast, BREEAM is analysed here as a social technology using Michel Foucault’s theory of governmentality. Interview data are used to explore the effect of BREEAM on visibilities, knowledge, techniques and professional identities. The analysis highlights a number of features of the BREEAM assessment process which generally go unremarked: professional and public understandings of the method, the deployment of different types of knowledge and their implication for the authority and legitimacy of the tool, and the effect of BREEAM on standard practice. The analysis finds that BREEAM’s primary effect is through its impact on standard practices. Other effects include the use of assessment methods to defend design decisions, its role in both operationalizing and obscuring the concept of green buildings, and the effect of tensions between project and method requirements for the authority of the tool. A reflection on assessment methods as neo-liberal tools and their adequacy for the promotion of sustainable construction suggests several limitations of lock-in that hinder variation and wider systemic change.
Abstract:
Khartoum, like many cities in least developed countries (LDCs), still witnesses a huge influx of people. Accommodating the newcomers leads to encroachment on cultivated land and to the sprawling expansion of Greater Khartoum: the city expanded in diameter from 16.8 km in 1955 to 802.5 km in 1998, and most of this horizontal expansion was residential. In 2008 Khartoum accommodated 29% of the urban population of Sudan, and today it is considered one of the 43 major cities in Africa that accommodate more than 1 million inhabitants. Most newcomers live on the outskirts of the city, e.g. in the Dar El-Salam and Mayo neighbourhoods, and the majority build their houses, especially the walls, from mud, wood, straw and sacks. The selection of building materials usually depends on their price, regardless of environmental impact, quality, thermal performance and the life of the material. Most of the time this increases costs and produces variable impacts on the environment over the life of the building. Consideration of the environmental, social and economic impacts is therefore crucial in the selection of any building material, and decreasing such impacts could lead to more sustainable housing. The sustainability of the wall building materials available for low-cost housing in Khartoum is compared here through the life cycle assessment (LCA) technique. The purpose of this paper is to compare the most widely available local wall building materials for the urban poor of Khartoum from a sustainability point of view, following the materials through their manufacture, their use, and their disposal when their life comes to an end. Findings reveal that traditional red bricks cannot be considered a sustainable wall building material on which to draw the future of low-cost housing in Greater Khartoum. On the other hand, the results of the comparison draw attention to the wide range of soil techniques and to their potential as a promising sustainable wall material for urban low-cost housing in Khartoum.
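An LCA-style comparison of this kind can be illustrated with a minimal weighted-scoring sketch. All impact values, material names and weights below are hypothetical placeholders for illustration, not the paper's data or method.

```python
# Hypothetical per-stage impact scores (1-5, lower = less impact) for three
# wall materials across the life-cycle stages the paper walks through:
# manufacture, use, and end-of-life disposal.
IMPACTS = {
    "red brick":             {"manufacture": 5, "use": 3, "disposal": 2},
    "mud":                   {"manufacture": 1, "use": 3, "disposal": 1},
    "stabilised soil block": {"manufacture": 2, "use": 2, "disposal": 1},
}

WEIGHTS = {"manufacture": 0.5, "use": 0.3, "disposal": 0.2}  # assumed weighting

def lca_score(material):
    """Weighted sum of stage impacts; a lower score reads as more
    sustainable under these assumed weights."""
    return sum(WEIGHTS[stage] * value for stage, value in IMPACTS[material].items())

ranking = sorted(IMPACTS, key=lca_score)  # most to least sustainable
```

A full LCA would replace these scores with measured inventory data (embodied energy, emissions, thermal performance) per functional unit of wall.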
Abstract:
Automatic generation of classification rules has become an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most methods for the generation of classification rules is the overfitting of training data. When Big Data is dealt with, this may result in the generation of a large number of complex rules, which may not only increase computational cost but also lower the accuracy in predicting further unseen instances. This has led to the necessity of developing pruning methods for the simplification of rules. In addition, classification rules are used to make predictions after their generation is complete. As far as efficiency is concerned, it is desirable to find the first rule that fires as soon as possible when searching through a rule set; thus a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations, highlight their limitations, and introduce some novel methods and techniques they have developed recently. These methods and techniques are also discussed in comparison to existing ones with respect to the efficient processing of Big Data.
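A minimal sketch of the rule-representation and rule-simplification ideas, illustrative only and not the authors' framework: rules live in an ordered list, prediction returns the label of the first rule that fires, and a naive pruner drops rules whose removal does not hurt accuracy on a validation set.

```python
# Rules are (conditions, label) pairs; conditions map attributes to values.
RULES = [
    ({"outlook": "sunny", "humidity": "high"}, "no"),
    ({"outlook": "overcast"}, "yes"),
    ({"windy": True}, "no"),
]
DEFAULT = "yes"  # fallback label when no rule fires

def fires(conditions, instance):
    """A rule fires when every condition matches the instance."""
    return all(instance.get(attr) == value for attr, value in conditions.items())

def predict(rules, instance, default=DEFAULT):
    """Linear scan over the ordered rule list; the first firing rule wins."""
    for conditions, label in rules:
        if fires(conditions, instance):
            return label
    return default

def prune(rules, validation):
    """Drop any rule whose removal does not reduce validation accuracy --
    one simple simplification strategy among many."""
    def accuracy(rs):
        return sum(predict(rs, x) == y for x, y in validation) / len(validation)
    kept = list(rules)
    for rule in list(kept):
        trial = [r for r in kept if r is not rule]
        if trial and accuracy(trial) >= accuracy(kept):
            kept = trial
    return kept

VALIDATION = [({"outlook": "overcast"}, "yes"),
              ({"outlook": "sunny", "humidity": "high"}, "no")]
PRUNED = prune(RULES, VALIDATION)
```

The ordered-list representation makes first-rule-fires prediction a linear scan; the chapter's point is that smarter structures (and smarter pruning) matter once the rule set grows to Big Data scale.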
Abstract:
Identifying the correct sense of a word in context is crucial for many tasks in natural language processing (machine translation is an example). State-of-the-art methods for Word Sense Disambiguation (WSD) build models using hand-crafted features that usually capture shallow linguistic information. Complex background knowledge, such as semantic relationships, is typically either not used, or used in a specialised manner, due to the limitations of the feature-based modelling techniques employed. On the other hand, empirical results from the use of Inductive Logic Programming (ILP) systems have repeatedly shown that they can use diverse sources of background knowledge when constructing models. In this paper, we investigate whether this ability of ILP systems could be used to improve the predictive accuracy of models for WSD. Specifically, we examine the use of a general-purpose ILP system as a method to construct a set of features using semantic, syntactic and lexical information. This feature set is then used by a common modelling technique in the field (a support vector machine) to construct a classifier for predicting the sense of a word. In our investigation we examine one-shot and incremental approaches to feature-set construction, applied to monolingual and bilingual WSD tasks. The monolingual tasks use 32 verbs and 85 verbs and nouns (in English) from the SENSEVAL-3 and SemEval-2007 benchmarks, while the bilingual WSD task consists of 7 highly ambiguous verbs in translating from English to Portuguese. The results are encouraging: the ILP-assisted models show substantial improvements over those that simply use shallow features. In addition, incremental feature-set construction appears to identify smaller and better sets of features. Taken together, the results suggest that the use of ILP with diverse sources of background knowledge provides a way of making substantial progress in the field of WSD.
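The feature-construction step can be sketched as follows, with the ILP search itself elided: each constructed feature is a boolean test built from background-knowledge predicates about a word's context, and the resulting 0/1 vectors would then be passed to a standard learner such as an SVM. All predicates, feature choices and the example context are illustrative assumptions, not the paper's actual clauses.

```python
# Background-knowledge predicates over a disambiguation context: lexical
# (neighbouring words), syntactic (POS tags) and semantic (hypernyms).
BACKGROUND = {
    "has_pos":         lambda ctx, tag: tag in ctx["pos_tags"],
    "has_neighbour":   lambda ctx, word: word in ctx["window"],
    "hypernym_of_obj": lambda ctx, h: h in ctx["object_hypernyms"],
}

# Each constructed feature pairs a predicate with an argument, mimicking a
# clause body an ILP system might propose during its search.
FEATURES = [
    ("has_neighbour", "bank"),
    ("has_neighbour", "river"),
    ("has_pos", "VBD"),
    ("hypernym_of_obj", "institution"),
]

def featurise(ctx):
    """Turn one disambiguation context into a 0/1 feature vector that a
    standard learner (e.g. an SVM) could consume."""
    return [int(BACKGROUND[pred](ctx, arg)) for pred, arg in FEATURES]

context = {
    "pos_tags": {"DT", "NN", "VBD"},
    "window": {"river", "steep"},
    "object_hypernyms": set(),
}
vector = featurise(context)
```

In the incremental variant, the FEATURES list would grow over successive rounds, keeping only tests that improve the downstream classifier.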
Abstract:
Generalized hypercompetitiveness in world markets has created the need to offer better products to potential and actual clients in order to gain an advantage over competitors. To ensure the production of an adequate product, enterprises need to work on the efficiency and efficacy of their business processes (BPs) by means of the construction of Interactive Information Systems (IISs, including Interactive Multimedia Documents), so that the processes run more fluidly and correctly. The construction of the correct IIS is a major task that can only be successful if the needs of every intervenient are taken into account. Their requirements must be defined with precision and extensively analyzed, and the system must consequently be accurately designed in order to minimize implementation problems, so that the IIS is produced on schedule and with as few mistakes as possible. The main contribution of this thesis is the proposal of Goals, a software (engineering) construction process which aims at defining the tasks to be carried out in order to develop software. This process defines the stakeholders, the artifacts, and the techniques that should be applied to achieve correctness of the IIS. Complementarily, this process suggests two methodologies to be applied in the initial phases of the lifecycle of the Software Engineering process: Process Use Cases for the requirements phase, and MultiGoals for the analysis and design phases. Process Use Cases is a UML-based (Unified Modeling Language), goal-driven and use-case-oriented methodology for the definition of functional requirements. It uses an information-oriented strategy in order to identify BPs while constructing the enterprise's information structure, and finishes with the identification of use cases within the design of these BPs. This approach provides a useful tool for both Business Process Management and Software Engineering activities. MultiGoals is a UML-based, use-case-driven and architecture-centric methodology for the analysis and design of IISs with support for Multimedia. It proposes the analysis of user tasks as the basis of the design of: (i) the user interface; (ii) the system behaviour, which is modeled by means of patterns that can combine Multimedia and standard information; and (iii) the database and media contents. This thesis makes the theoretical presentation of these approaches, accompanied by examples from a real project which provide the necessary support for understanding the techniques used.
Abstract:
The way constitutional jurisdiction is organised opens the possibility of democratising it further, through popular participation and active standing in the constitutional process (the proceduralist model), while at the same time ensuring fast, technically viable decisions to the complex problems of constitutional law (the substantialist model). Comparison with the constitutional jurisdiction of the United States is instructive given the broad decision-making experience of the Supreme Court, which, through a methodology of construction of rights rather than mere interpretation of the Constitution, has updated and reconstructed throughout its historical evolution the meaning of fundamental-rights norms and of American constitutional principles. Construction, as a constitutional hermeneutic method of substantialist inspiration, works with techniques such as the balancing of principles, the protection of minority interests and the linking of fundamental rights to political values, and it can be brought to bear on Brazilian constitutional jurisdiction in order to improve the construction of fundamental rights currently carried out through judicial activism in diffuse and abstract constitutionality review. To define the limits of construction is, on the other hand, to seek a dialogue with the proceduralist theses, aiming to widen citizen participation in the construction of fundamental rights through the constitutional process and to discuss ways for society to evaluate the activist decisions handed down in diffuse and abstract constitutionality review.
Abstract:
We present the exact construction of Riemannian (or stringy) instantons, which are classical solutions of 2D Yang-Mills theories that interpolate between initial and final string configurations. They satisfy the Hitchin equations with special boundary conditions. For the case of U(2) gauge group those equations can be written as the sinh-Gordon equation with a delta-function source. Using the techniques of integrable theories based on the zero curvature conditions, we show that the solution is a condensate of an infinite number of one-solitons with the same topological charge and with all possible rapidities.
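The U(2) reduction mentioned above can be written schematically as follows; the normalisation and the strength of the point source are assumptions for illustration, not conventions taken from the paper:

```latex
% Sinh-Gordon reduction of the Hitchin equations with a delta-function source:
% \varphi parametrises the U(2) Higgs-field data, z_0 is the insertion point
% and \alpha the (assumed) source strength.
\partial_z \partial_{\bar z}\, \varphi
  \;=\; \tfrac{1}{2}\,\sinh\varphi \;+\; \pi\,\alpha\,\delta^{(2)}(z - z_0)
```

Away from z_0 this is the ordinary sinh-Gordon equation, which is why the zero-curvature (integrable-systems) machinery and multi-soliton condensates apply.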