17 results for design-based survey sampling

at Instituto Politécnico do Porto, Portugal


Relevance:

100.00%

Publisher:

Abstract:

Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behavior of soil-related processes and properties as well as to generate new hypotheses for future experimentation. A good model and analysis of soil property variation, allowing us to draw sound conclusions and to estimate spatially correlated variables at unsampled locations, clearly depends on the amount and quality of the data and on the robustness of the techniques and estimators. The quality of the data, in turn, depends on a competent data collection procedure and on capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key factors such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land color, soil texture, land slope, and solar exposure. Obtaining good quality data from forest soils is predictably expensive, as it is labor intensive and demands considerable manpower and equipment both in field work and in laboratory analysis. Moreover, the sampling scheme to be used in a forest-field data collection procedure is not simple to design, as the sampling strategies chosen depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil at all is found, or if large trees prevent soil collection. Considering this, a proficient design of a soil sampling campaign in forest fields is not always a simple process and sometimes represents a truly huge challenge. In this work, we present some difficulties that occurred during two experiments on forest soil conducted in order to study the spatial variation of some soil physical-chemical properties. Two different sampling protocols were considered for monitoring two types of forest soils located in NW Portugal: umbric regosol and lithosol. Two different tools for sample collection were also used: a manual auger and a shovel. Both scenarios were analyzed, and the results achieved allow us to conclude that monitoring forest soil for mathematical and statistical investigation requires a data collection procedure compatible with established protocols, but the assumption of a pre-defined grid often fails when the variability of the soil property is not uniform in space. In this case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into consideration in the mathematical procedure.
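The closing point, that a pre-defined grid should be adapted where the soil property varies non-uniformly, can be illustrated with a small sketch. The following Python snippet is only an illustrative assumption (not the protocol used in these experiments): it densifies a regular grid in strata whose pilot samples show a high coefficient of variation; all coordinates, spacings and thresholds are hypothetical.

# A minimal sketch of adapting a regular sampling grid to non-uniform spatial
# variability: strata whose pilot samples show a higher coefficient of
# variation (CV) receive a denser grid. Names and figures are illustrative.
import numpy as np

def grid(x0, x1, y0, y1, spacing):
    """Regular sampling grid over a rectangular stratum."""
    xs = np.arange(x0, x1, spacing)
    ys = np.arange(y0, y1, spacing)
    return [(x, y) for x in xs for y in ys]

def adapted_grid(strata, base_spacing=20.0, cv_threshold=0.3):
    """Halve the grid spacing in strata whose pilot CV exceeds the threshold."""
    points = []
    for (x0, x1, y0, y1), pilot in strata:
        cv = np.std(pilot) / np.mean(pilot)      # coefficient of variation
        spacing = base_spacing / 2 if cv > cv_threshold else base_spacing
        points += grid(x0, x1, y0, y1, spacing)
    return points

# Two hypothetical strata with pilot measurements of a soil property:
strata = [((0, 100, 0, 100), [3.1, 3.0, 3.2, 2.9]),    # fairly homogeneous
          ((100, 200, 0, 100), [1.2, 3.8, 0.9, 4.1])]  # highly variable
print(len(adapted_grid(strata)), "planned sampling points")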

Relevance:

100.00%

Publisher:

Abstract:

Objectives: The purpose of this article is to identify differences between surveys using paper and online questionnaires. The author has extensive experience with questions concerning the development of survey-based research, e.g. the limitations of postal and online questionnaires. Methods: Paper and online questionnaires were used in the physician studies carried out in 1995 (doctors graduated in 1982-1991), 2000 (doctors graduated in 1982-1996), 2005 (doctors graduated in 1982-2001) and 2011 (doctors graduated in 1977-2006), and in a 2000 study of 457 family doctors. The response rates were 64%, 68%, 64%, 49% and 73%, respectively. Results: The results of the physician studies showed that there were differences between the methods, connected with the use of paper-based versus online questionnaires and with the response rate. The online survey gave a lower response rate than the postal survey. The major advantages of the online survey were the short response time, the very low financial resources needed, and the fact that data were loaded directly into the data analysis software, saving the time and resources associated with the data entry process. Conclusions: This article helps researchers plan the study design and choose the right data collection method.

Relevance:

100.00%

Publisher:

Abstract:

The scope of this work involves testing the BIM model on a project under construction by Mota-Engil – Engenharia, through the experimental extraction of preparation drawings supporting on-site execution. Chapter 1 of this report defines the scope and objectives of the work, provides a historical background of the subject, and addresses the concepts and activities of construction preparation in its traditional form. Chapter 2 addresses the state of the art of construction preparation and, more specifically, of BIM technology at national and international level; it seeks to define the main concepts behind this new methodology by identifying and characterizing the technology involved and its level of development. Based on practical cases of construction preparation in its traditional form, identified and developed in Chapter 3, a standard set of supporting drawings, identified and characterized in Chapter 4, was compiled, covering drawings that are frequent and common to the execution of several types of building works. Building on this compilation of practical cases and on the study of the detailed design of the contract underlying this work, from which the BIM model was conceived, a set of 2D preparation and execution-support drawings to be extracted from the model was identified. Chapter 5 describes how the project design was studied, highlighting the most relevant factors and specifying the drawings to be extracted. Supported by the ArchiCAD modelling software, the extraction of the previously identified set of drawings was achieved using the functionalities available in the software, which allows the creation of 2D drawings that may or may not be updated automatically from the model. Any change introduced in the virtual model is automatically propagated to the two-dimensional drawings, if the user so wishes. Throughout this work, the constraints inherent to the extraction process were detected and analysed, as reported in Chapter 6, in order to establish standard modelling rules to be adopted in future contracts that may simplify obtaining the preparation drawings needed for their execution. Section 6.3 identifies improvements to be introduced in the model. In conclusion, Chapter 7 discusses specificities of the construction sector that increasingly support and highlight the need to use new technologies towards the adoption of standard practices and tools for supporting construction execution. Since BIM technology cuts across the whole sector, its use with standard rules for model design and data extraction enhances the optimisation of cost, time, resources and the final quality of a development throughout its entire life cycle, besides supporting decision-making with high reliability over that period. BIM technology makes it possible to preview the building to be constructed with a high level of detail, with all the advantages that this brings.

Relevance:

40.00%

Publisher:

Abstract:

World Congress on Computer Science, Engineering and Technology Education, March 19-22, 2006, São Paulo, Brazil.

Relevance:

40.00%

Publisher:

Abstract:

This paper presents a project consisting of the development of an Intelligent Tutoring System for training and support in the development of electrical installation projects, to be used by electrical engineers, technicians and students. One of the major goals of this project is to devise a teaching model based on Intelligent Tutoring techniques that considers not only academic knowledge but also other, more empirical types of knowledge, and is able to successfully accomplish training in electrical installation design.

Relevance:

40.00%

Publisher:

Abstract:

Urban Computing (UrC) provides users with situation-appropriate information by considering the context of users, devices, and the social and physical environment in urban life. With social network services, UrC makes it possible for people with common interests to organize a virtual society through the exchange of context information among them. In these settings, people and personal devices are vulnerable to fake and misleading context information transferred by attackers from unauthorized and unauthenticated servers. So-called smart devices, which act automatically on certain context events, are even more vulnerable if they are not prepared for such attacks. In this paper, we illustrate some UrC service scenarios and identify important context information, possible threats, protection methods, and secure context management for people.
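As one concrete and deliberately simplified illustration of protecting devices from context information sent by unauthenticated servers, the sketch below signs each context message with an HMAC shared between the device and a trusted server. The key store, message fields and server names are hypothetical assumptions, not the scheme proposed in the paper.

# A minimal sketch: a device accepts a context update only when its HMAC-SHA256
# tag matches a key shared with a known, trusted server. All names are illustrative.
import hmac
import hashlib
import json

SHARED_KEYS = {"traffic-server-01": b"example-shared-secret"}  # hypothetical key store

def sign_context(server_id: str, context: dict) -> dict:
    """Server side: attach an HMAC tag to a context message."""
    payload = json.dumps(context, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEYS[server_id], payload, hashlib.sha256).hexdigest()
    return {"server": server_id, "context": context, "tag": tag}

def verify_context(message: dict) -> bool:
    """Device side: accept the context only if the tag matches a known server."""
    key = SHARED_KEYS.get(message["server"])
    if key is None:                      # unknown or unauthorized server
        return False
    payload = json.dumps(message["context"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_context("traffic-server-01", {"road": "A1", "congestion": "high"})
print(verify_context(msg))   # True for an authentic message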

Relevance:

40.00%

Publisher:

Abstract:

When exploring a virtual environment, realism depends mainly on two factors: realistic images and real-time feedback (motions, behaviour, etc.). In this context, the photorealism and physical validity of computer-generated images required by emerging applications, such as advanced e-commerce, still impose major challenges in the area of rendering research, while the complexity of lighting phenomena further requires powerful and predictable computing when time constraints must be met. In this technical report we address the state of the art in rendering, focusing on approaches, techniques and technologies that might enable real-time, interactive, web-based client-server rendering systems. The focus is on the end systems and not on the networking technologies used to interconnect client(s) and server(s).

Relevance:

40.00%

Publisher:

Abstract:

Radio link quality estimation in Wireless Sensor Networks (WSNs) has a fundamental impact on network performance and also affects the design of higher-layer protocols. Therefore, for about a decade, it has attracted a vast array of research work. Reported works on link quality estimation are typically based on different assumptions, consider different scenarios, and provide radically different (and sometimes contradictory) results. This article provides a comprehensive survey of the related literature, covering the characteristics of low-power links, the fundamental concepts of link quality estimation in WSNs, a taxonomy of existing link quality estimators, and their performance analysis. To the best of our knowledge, this is the first survey tackling link quality estimation in WSNs in detail. We believe our efforts will serve as a reference to orient researchers and system designers in this area.
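As a concrete illustration of the software-based estimators covered by such a survey, the sketch below implements a packet-reception-ratio (PRR) estimator smoothed with an exponentially weighted moving average, a common approach in this literature. It is an illustrative assumption, not a specific estimator from the article; the window size and smoothing factor are arbitrary.

# A minimal EWMA-smoothed PRR link quality estimator.
class EwmaPrrEstimator:
    def __init__(self, window: int = 10, alpha: float = 0.6):
        self.window = window      # packets per estimation window
        self.alpha = alpha        # weight given to the estimate history
        self.received = 0
        self.sent = 0
        self.quality = None       # smoothed PRR in [0, 1]

    def report(self, delivered: bool) -> None:
        """Record the outcome of one transmitted packet."""
        self.sent += 1
        self.received += int(delivered)
        if self.sent == self.window:
            prr = self.received / self.sent
            self.quality = prr if self.quality is None else \
                self.alpha * self.quality + (1 - self.alpha) * prr
            self.sent = self.received = 0

# Feed the estimator a hypothetical trace: 1 = packet delivered, 0 = lost.
est = EwmaPrrEstimator()
for outcome in [1, 1, 0, 1, 1, 1, 0, 1, 1, 1] * 3:
    est.report(bool(outcome))
print(f"estimated link quality: {est.quality:.2f}")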

Relevance:

40.00%

Publisher:

Abstract:

Variations of manufacturing process parameters and environmental aspects may affect the quality and performance of composite materials, which consequently affects their structural behaviour. Reliability-based design optimisation (RBDO) and robust design optimisation (RDO) search for safe structural systems with minimal variability of response when subjected to uncertainties in material design parameters. An approach that simultaneously considers reliability and robustness is proposed in this paper. Depending on a given reliability index imposed on composite structures, a trade-off is established between the performance targets and robustness. Robustness is expressed in terms of the coefficient of variation of the constrained structural response weighted by its nominal value. The normed Pareto front is built and the point nearest to the origin is taken as the best solution of the bi-objective optimisation problem.
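The selection rule mentioned at the end, taking the point of the normed Pareto front nearest to the origin, can be sketched in a few lines. The snippet below is a hedged illustration with hypothetical candidate designs, not the authors' implementation; both objectives are assumed to be minimised and are normalised by their maxima.

# A minimal sketch of picking the best compromise solution on a bi-objective
# Pareto front as the point nearest to the origin of the normalised space.
import numpy as np

def nearest_to_origin(front: np.ndarray) -> int:
    """front: (n, 2) array of non-dominated objective values, both minimised.
    Returns the index of the best compromise solution."""
    normed = front / front.max(axis=0)          # scale each objective to [0, 1]
    distances = np.linalg.norm(normed, axis=1)  # Euclidean distance to the origin
    return int(np.argmin(distances))

# Hypothetical non-dominated designs: (performance cost, coefficient of variation)
front = np.array([[1.00, 0.08],
                  [1.10, 0.05],
                  [1.35, 0.03]])
best = nearest_to_origin(front)
print("best compromise design:", best, front[best])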

Relevance:

40.00%

Publisher:

Abstract:

An approach for the analysis of uncertainty propagation in reliability-based design optimization of composite laminate structures is presented. Using the Uniform Design Method (UDM), a set of design points is generated over a domain centered on the mean reference values of the random variables. A methodology based on the inverse optimal design of composite structures to achieve a specified reliability level is proposed, and the corresponding maximum load is obtained as a function of the ply angle. Using the generated UDM design points as input/output patterns, an Artificial Neural Network (ANN) is developed based on an evolutionary learning process. A Monte Carlo simulation using the developed ANN is then performed to simulate the behavior of the critical Tsai number, the structural reliability index, and their relative sensitivities as functions of the ply angle of the laminates. The results are generated for uniformly distributed random variables on a domain centered on the mean values. The statistical analysis of the results enables the study of the variability of the reliability index and of its sensitivity relative to the ply angle. Numerical examples showing the utility of the approach for the robust design of angle-ply laminates are presented.
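The surrogate-assisted Monte Carlo idea can be sketched as follows. This is only an illustrative assumption: a scikit-learn MLPRegressor stands in for the evolutionary-trained ANN, an analytic toy function stands in for the structural response, and a uniform grid plays the role of the UDM design points.

# A minimal sketch: fit a small neural network surrogate on a few design points,
# then run a Monte Carlo simulation with uniform random variables to estimate
# the variability of the response. Everything numeric here is hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_response(e1, e2):
    """Stand-in for the structural analysis (e.g. a critical Tsai number)."""
    return 2.0 + 0.8 * e1 - 0.5 * e2 + 0.1 * e1 * e2

# 1. Design points over a domain centred on the mean reference values.
grid = np.linspace(-1.0, 1.0, 7)
X = np.array([(a, b) for a in grid for b in grid])
y = np.array([expensive_response(a, b) for a, b in X])

# 2. Train the surrogate on the input/output patterns.
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
ann.fit(X, y)

# 3. Monte Carlo simulation with uniformly distributed random variables.
samples = rng.uniform(-1.0, 1.0, size=(20000, 2))
pred = ann.predict(samples)
print(f"mean response: {pred.mean():.3f}, coefficient of variation: {pred.std()/pred.mean():.3f}")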

Relevance:

40.00%

Publisher:

Abstract:

Structural health monitoring has long been identified as a prominent application of Wireless Sensor Networks (WSNs), as traditional wired solutions present some inherent limitations such as installation/maintenance cost, scalability and visual impact. Nevertheless, there is a lack of ready-to-use, off-the-shelf WSN technologies able to fulfill some of the most demanding requirements of these applications, which can span from critical physical infrastructures (e.g. bridges, tunnels, mines, the energy grid) to historical buildings or even industrial machinery and vehicles. Low-power and low-cost yet extremely sensitive and accurate accelerometer and signal acquisition hardware, and stringent time synchronization of all sensor data, are just examples of the requirements imposed by most of these applications. This paper presents a prototype system for health monitoring of civil engineering structures that has been jointly conceived by a team of civil, electrical and computer engineers. It merges the benefits of standard, off-the-shelf (COTS) hardware and communication technologies with a minimum set of custom-designed signal acquisition hardware that is mandatory to fulfill all application requirements.

Relevance:

40.00%

Publisher:

Abstract:

The interest in the development of climbing robots has grown rapidly in recent years. Climbing robots are useful devices that can be adopted in a variety of applications, such as maintenance and inspection in the process and construction industries. These systems are mainly adopted in places where direct access by a human operator is very expensive, because of the need for scaffolding, or very dangerous, due to the presence of a hostile environment. The main motivations are to increase operation efficiency, by eliminating the costly assembly of scaffolding, and to protect human health and safety in hazardous tasks. Several climbing robots have already been developed, and others are under development, for applications ranging from cleaning to the inspection of difficult-to-reach constructions.

A wall-climbing robot should not only be light but also have a large payload capacity, so that it may reduce excessive adhesion forces and carry instrumentation during navigation. These machines should be capable of travelling over different types of surfaces with different inclinations, such as floors, walls or ceilings, and of moving between such surfaces (Elliot et al. (2006); Sattar et al. (2002)). Furthermore, they should be able to adapt and reconfigure to various environmental conditions and should be self-contained.

Up to now, considerable research has been devoted to these machines and various types of experimental models have already been proposed (according to Chen et al. (2006), over 200 prototypes aimed at such applications had been developed in the world by the year 2006). However, the application of climbing robots is still limited. Apart from a couple of successful industrialized products, most are only prototypes and few of them can be found in common use due to unsatisfactory performance in on-site tests (regarding aspects such as speed, cost and reliability). Chen et al. (2006) present the main design problems affecting the system performance of climbing robots and also suggest solutions to these problems.

The two major issues in the design of wall-climbing robots are their locomotion and adhesion methods. With respect to locomotion, four types are often considered: crawler, wheeled, legged and propulsion robots. Although the crawler type is able to move relatively fast, it is not adequate for rough environments. On the other hand, the legged type easily copes with obstacles found in the environment, although its speed is generally lower and it requires complex control systems.

Regarding adhesion to the surface, the robots should be able to produce a secure gripping force using a lightweight mechanism. The adhesion method is generally classified into four groups: suction force, magnetic, gripping to the surface, and thrust force. Recently, however, new methods for ensuring adhesion, based on biological findings, have been proposed. The vacuum principle is light and easy to control, though it presents the problem of supplying compressed air; an alternative, with costs in terms of weight, is the adoption of a vacuum pump. The magnetic principle implies heavy actuators and is used only on ferromagnetic surfaces. Thrust-force robots make use of the forces developed by thrusters to adhere to surfaces, but are used in very restricted and specific applications.

Bearing these facts in mind, this chapter presents a survey of the different applications and technologies adopted for the implementation of climbing robot locomotion and adhesion to surfaces, focusing on the new technologies that have recently been developed to fulfill these objectives. The chapter is organized as follows. Section two presents several applications of climbing robots. Sections three and four present the main locomotion principles and the main "conventional" technologies for adhering to surfaces, respectively. Section five describes recent biologically inspired technologies for robot adhesion to surfaces. Section six introduces several new architectures for climbing robots. Finally, section seven outlines the main conclusions.

Relevance:

40.00%

Publisher:

Abstract:

The content of a Learning Object is frequently characterized by metadata from several standards, such as LOM, SCORM and QTI. Specialized domains require new application profiles that further complicate the task of editing the metadata of a learning object, since their data models are not supported by existing authoring tools. To cope with this problem we designed a metadata editor supporting multiple metadata languages, each with its own data model. It is assumed that the supported languages have an XML binding, and we use RDF to create a common metadata representation, independent of the syntax of each metadata language. The combined data model supported by the editor is defined as an ontology. Thus, the process of extending the editor to support a new metadata language is twofold: first, the conversion between the XML binding of the metadata language and RDF, in both directions; second, the extension of the ontology to cover the new metadata model. In this paper we describe the general architecture of the editor, explain how a typical metadata language for learning objects is represented as an ontology, and show how this formalization captures all the data required to generate the graphical user interface of the editor.
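The first half of the extension process, converting the XML binding of a metadata language to RDF, might look roughly like the sketch below, which uses rdflib. The namespace URI and element names are simplified assumptions, not the official LOM binding or the editor's actual code.

# A minimal sketch: map each element of a simplified LOM-like XML record to an
# RDF triple under a common metadata representation.
import xml.etree.ElementTree as ET
from rdflib import Graph, Namespace, Literal, BNode, RDF

LOM = Namespace("http://example.org/lom#")   # hypothetical namespace

lom_xml = """
<lom>
  <general>
    <title>Introduction to Survey Sampling</title>
    <language>en</language>
  </general>
</lom>
"""

def lom_to_rdf(xml_text: str) -> Graph:
    """Convert each leaf XML element into an RDF triple."""
    g = Graph()
    root = ET.fromstring(xml_text)
    subject = BNode()
    g.add((subject, RDF.type, LOM.LearningObject))
    for section in root:                     # e.g. <general>
        for field in section:                # e.g. <title>, <language>
            g.add((subject, LOM[field.tag], Literal(field.text)))
    return g

print(lom_to_rdf(lom_xml).serialize(format="turtle"))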

Relevance:

40.00%

Publisher:

Abstract:

eLearning has evolved in a gradual and consistent way. Along with this evolution, several specialized and disparate systems have appeared to fulfill the needs of teachers and students, such as repositories of learning objects, intelligent tutors, and automatic evaluators. This heterogeneity poses issues that need to be addressed in order to promote interoperability among systems. Based on this fact, the standardization of content takes a leading role in the eLearning realm. This article presents a survey of current eLearning content standards. It gathers information on the most emergent standards and categorizes them according to three distinct facets: metadata, content packaging and educational design.

Relevance:

40.00%

Publisher:

Abstract:

Power systems have been experiencing huge changes, mainly due to the substantial increase of distributed generation (DG) and to operation in competitive environments. Virtual Power Players (VPP) can aggregate several players, namely a diversity of energy resources, including DG based on several technologies, electric storage systems (ESS) and demand response (DR). Energy resource management gains increasing relevance in this competitive context, which makes the use of DR more interesting and flexible, giving rise to a wide range of new opportunities. This paper proposes a methodology to support VPPs in the management of DR programs, considering all the existing energy resources (generation and storage units) and the distribution network. The proposed method is based on locational marginal price (LMP) values. The evaluation of the impact of specific DR programs on the LMP values supports the manager's decision concerning the use of DR. The proposed method has been computationally implemented and its application is illustrated in this paper using a 33-bus network with intensive use of DG.
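A toy sketch of the decision-support idea, assuming the LMP values with and without a given DR program are already available from a market/OPF tool: the program is activated only if the implied cost reduction exceeds the DR remuneration. All bus data and prices are hypothetical, the load reduction caused by DR itself is ignored for simplicity, and this is not the paper's methodology.

# A minimal sketch of comparing energy acquisition cost under LMPs computed
# with and without a DR program, net of the DR remuneration.
def dr_benefit(buses, dr_remuneration_cost):
    """buses: list of (load_MWh, lmp_without_dr, lmp_with_dr) tuples."""
    cost_without = sum(load * lmp0 for load, lmp0, _ in buses)
    cost_with = sum(load * lmp1 for load, _, lmp1 in buses)
    return cost_without - cost_with - dr_remuneration_cost

# Hypothetical 3-bus snapshot (MWh, price without DR, price with DR):
buses = [(12.0, 55.0, 48.0), (8.0, 60.0, 52.0), (15.0, 50.0, 49.0)]
saving = dr_benefit(buses, dr_remuneration_cost=100.0)
print("activate DR program" if saving > 0 else "do not activate",
      f"(net saving: {saving:.1f})")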